Snowflake Data Engineer

11 Nov 2024

Vacancy expired!

Job Title: Snowflake Data Engineer
Location: Durham, NC
Duration: Long term

Job Description:

THE LOCATION:
Durham, NC, but open to other locations.

THE PROJECT:
The client is executing a data strategy centered on Snowflake: they want to expand into a data lake and consolidate all of their data warehouses into Snowflake itself. They currently run a SAS product and an Oracle data warehouse. They are decommissioning SAS now and hope to retire the Oracle data warehouse soon as well, bringing both into Snowflake. They have already dropped Hadoop because it is no longer a preferred technology, so the overall strategy has shifted to Snowflake. They don't really need an architect, since the Snowflake environment is already built out; they need someone who can do the customization and development within that framework.

THE SELL:
This is an opportunity to implement Snowflake architecture in a new environment and help bring an entire business unit into a current state in line with enterprise-wide goals.

MUST HAVE:
Looking for a combined designer and developer for the data ingestion framework into Snowflake. Is there a better way to design the data pipeline for the system? They are in a data warehouse environment and need someone who can design the Snowflake data schema; once the architecture is built, they can maintain and keep the database running. They need someone who can create a cloud-native data lake using Snowflake and a fully automated ETL ingestion pipeline using Snowpipe. Candidates with this experience will be a fit for the role.

The Role
As a Sr. Data Engineer, your role will involve the following:
• Designing, developing, enhancing, testing, and supporting batch processes, applications, and data movement technologies
• Implementing scalable designs and software engineering excellence practices
• Assessing and researching the current implementation of the platforms and defining the course of action for modernization
• Providing technical vision and leadership; hands-on implementation of technology solutions to meet business requirements
• Collaborating with peers and the leadership team on process improvement ideas, policy and procedure enhancements, and opportunities to improve the customer service experience

The Expertise You Have
• Bachelor's or Master's in a technology-related field (e.g., Computer Science, Engineering) required
• 8+ years of related experience in data engineering, analysis, data warehouses, and data lakes; specialist understanding and experience of methodologies such as data warehousing, data visualization, and data integration
• Solid experience and understanding of designing and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must
• Solid experience with relational database technologies (Oracle SQL & PL/SQL or similar RDBMS), preferably Snowflake
• Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers
• Experience migrating on-prem databases to the AWS Cloud and Snowflake using AWS services such as EC2, S3, and EMR
• Experience with data lake implementation in Snowflake
• Knowledge and expertise in data modeling techniques and standard methodologies (relational, dimensional), plus any experience with data modeling tools (e.g., PowerDesigner)
• Prior experience with data ingestion tool sets (e.g., Apache NiFi, Kafka) is advantageous
• Working experience with some or all of the following: AWS, containerization, associated build and deployment CI/CD pipelines, Lambda development
• Experience with business analytics (preferred tool is Tableau)
• Able to work collaboratively with a geographically diverse team
• Proven track record of working in collaborative teams to deliver high-quality data solutions in a multi-developer agile environment, following design and coding standard methodologies
• Outstanding SQL skills and experience performing deep data analysis on multiple database platforms
• Understanding of data transformation and translation requirements and which tools to use to get the job done
• Prior experience setting up reliable infrastructure (hardware, scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka
• Proven focus on resiliency and reliability
• Excellent written and oral communication skills
• Python, Unix, or Java scripting experience is nice to have
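For candidates unfamiliar with the Snowpipe requirement mentioned above, a minimal auto-ingest setup in Snowflake SQL looks roughly like the sketch below. All object names, the bucket URL, and the storage integration are hypothetical placeholders, not part of the client's actual environment.

```sql
-- Hypothetical names throughout (raw_stage, s3_int, orders_pipe, raw.orders).

-- External stage pointing at an S3 landing zone
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int;

-- Pipe that continuously loads new files from the stage into a raw table
CREATE OR REPLACE PIPE orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.orders
  FROM @raw_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

With AUTO_INGEST = TRUE, Snowpipe loads files as cloud storage event notifications arrive, which is what "fully automated ETL ingestion" typically means in practice; the alternative is invoking the Snowpipe REST API per batch.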

With Regards,
Kotesh
Direct: Mobile: Desk#: EXT 174

HCL Global Systems, Inc | 24543 Indoplex Circle | Suite 220 | Farmington, MI

  • ID: #22566459
  • State: North Carolina (Durham 27701, USA)
  • City: Durham
  • Salary: Depends on Experience
  • Job type: Contract
  • Showed: 2021-11-11
  • Deadline: 2022-01-09
  • Category: Et cetera