The Senior Data Engineer will be responsible for implementing data integration services for the enterprise data lakehouse and data hub, and for building the visualization layers of the related data services. This individual will develop data integration programs based on AWS services (Glue, Python, Step Functions, Snowflake, HVR, Fivetran, Talend, etc.); a minimal illustrative sketch of this kind of work appears after the requirements below. The role requires a deep technical understanding of AWS Glue, Snowflake, and data engineering technologies, including data modeling, optimization, data extraction, loading, transformation, and reporting. This role will manage the resolution of complex issues and participate in the design and implementation of new technical solutions and tools.

Responsibilities
- Collaborate with business stakeholders and technical, cross-functional project teams to implement data analytics solutions based on AWS services, Snowflake, HVR, and analytics tools.
- Lead large-scale data warehousing and analytics projects; develop innovative solutions to complex business and technology problems.
- Design, build, and implement automation solutions for application builds, testing, and deployment for cloud services.
- Define and implement AWS and Snowflake best practices.
- Identify and resolve data and technical issues and mediate business implications efficiently, effectively, and with consistently high quality.
- Collaborate with other team members in the design, implementation, and documentation of solutions for new projects.
- Collaborate with colleagues on the Global Data Management team in the design, implementation, and documentation of solutions for daily issues/support, release management, and new projects.
- Collaborate with colleagues on the Global Data Management and Cloud teams to improve and optimize the global infrastructure for data analytics applications and to provide high-performance, scalable solutions that meet the needs of global users.
- Collaborate with the Infrastructure and Cloud teams to define and implement backup, disaster recovery, and archival strategies.
- Collaborate with the Global Data Management, Infrastructure, and Cloud teams to proactively evaluate the capacity and resource utilization of various data analytics services, storage, and security.

Required Experience
- 7+ years of hands-on design and development experience implementing data warehouses.
- 5+ years of hands-on design and development experience implementing data analytics applications using AWS services such as S3, Glue, Step Functions, Kinesis, Lambda, Lake Formation, Athena, Elastic Container Service/Elastic Kubernetes Service, Elasticsearch, and Amazon EMR, Redshift, or Snowflake.
- 5+ years of experience with ETL tools such as Talend, Alteryx, and HVR/Fivetran.
- Experience with Data Lake/Snowflake platform implementation, including hands-on experience (implementation and performance tuning) with HVR/Talend.
- Expertise in building data ingestion services from disparate real-time and batch data sources, including streaming sources, APIs, logs, XML files, JSON files, text files, and relational and non-relational databases.
- Expertise in SQL and NoSQL databases, data warehouses, and data processing.
- Hands-on working knowledge of Python, JavaScript/TypeScript, AWS CDK, and JSON.
- Understanding of database and analytical technologies in the industry, including NoSQL databases, data warehouses, BI reporting, and dashboard development.
- Experience ingesting data from different sources and/or ETL development experience.
- Experience implementing data analytics applications based on data from SAP ECC.
- Prior experience designing dimensional data models.
- Experience with business intelligence/visualization tools (Power BI, Business Objects, SAP Analytics Cloud, or Tableau).
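As a rough illustration of the Glue/Python data integration work described above, the sketch below reads a catalog table, applies a simple column mapping, and writes Parquet to S3. It is not part of the posting; the database, table, column, and bucket names are hypothetical placeholders.

```python
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name and initialize the job context.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (database/table names are placeholders).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="sales_orders",
)

# Example transformation: keep and retype a few columns (placeholder schema).
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_date", "string", "order_date", "date"),
        ("amount", "double", "amount", "double"),
    ],
)

# Land the curated output in S3 as Parquet for downstream consumers
# (e.g., Athena or a Snowflake external stage); the bucket path is a placeholder.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/sales_orders/"},
    format="parquet",
)

job.commit()
```

In practice a job like this would typically be orchestrated by AWS Step Functions and parameterized per source system, in line with the responsibilities listed above.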
- ID: #49179616
- State: New Jersey, USA
- City: Union Beach 07735
- Salary: BASED ON EXPERIENCE
- Job type: Permanent
- Posted: 2023-02-13
- Deadline: 2023-04-11
- Category: Et cetera