- Bachelor’s degree in Engineering (preferably Analytics, MIS, or Computer Science); Master’s degree preferred.
- Minimum 7 years of data analytics and data lake ETL experience using tools such as Talend, Hadoop Pig, Scala scripting, AWS technologies, and the Microsoft BI stack (SQL, SSIS, SSAS, and multidimensional models).
- Data Concepts (ETL, near-/real-time streaming, data structures, metadata and workflow management)
- Big Data (Hadoop/EMR, Hive, MapReduce, Kinesis/Oozie, Sqoop, Spark)
- Programming / Scripting (Python, Java, C/C++, Scala, Bash, Korn Shell)
- Data Formats (JSON, XML, YAML)
- Code Management Tools (Git/GitHub, Bitbucket, SVN, TFS)
- Minimum 2 years of active Big Data development experience.
- Good knowledge of cloud deployments of BI solutions, including use of the AWS ecosystem.
- Preferred: experience developing backend data solutions for front-end tools such as Tableau, Power BI, and/or Qlik Sense.
- Ability to pull together complex and disparate data sources, warehouse them, and architect a foundation for producing BI and analytical content, while operating in a fluid, rapidly changing data environment.
- Experience with Terraform or other infrastructure-as-code and CI/CD automation tools.