Responsibilities:
- Design and develop ETL (Talend), SQL, and Python based processes to perform complex data transformations.
- Design, code, and test major data processing features, and work jointly with other team members to deliver complex software enhancements for the enterprise data storage platforms (RDBMS and NoSQL).
- Build data integration solutions to handle batch, streaming, and IoT data on ETL and Big Data platforms.
- Develop and deliver changes in the Enterprise Data Warehouse according to data warehousing best practices.
- Gather requirements and construct documentation to aid in maintenance and code reuse in accordance with team processes and standards
- Monitor scheduled jobs and improve reliability of ongoing processing
- Monitor, measure, and enhance ways to improve system performance
- Manage multiple deliverables efficiently.
- Perform other duties as assigned.
Qualifications:
- 2+ years of hands-on experience with an ETL tool, preferably Talend, designing and unit testing processes
- 2+ years' experience with SQL or a SQL-like language, writing complex, efficient queries and troubleshooting as needed
- Good knowledge of relational DBMS technology; SQL Server a plus
- Experience reading and parsing multiple data formats (JSON, delimited, XML, etc.)
- Able to gather requirements and document as needed
- Experience with shell scripting / Python
- Strong understanding of relational database and data processing concepts
- Hands-on experience designing and implementing data ingestion for real-time processes (IoT, eCommerce) a plus
- Strong communication skills
- Good analytical and problem-solving skills
- Able to work on multiple priorities in a deadline-driven environment
- Candidate must be thorough and detail-oriented
- Development experience in a Big Data environment (Spark, Kafka, message queues) a plus
- Knowledge of CRM, MDM, and Business Intelligence a plus