Data Engineer

30 Jun 2024

Vacancy expired!

DATA ENGINEER (REMOTE)

Required Skills:

  • Bachelor's degree in Data Engineering, Computer Science, or related field.
  • Experience designing and implementing data engineering pipelines.
  • Advanced knowledge in Python and PySpark.
  • Working knowledge of one or more SQL dialects.
  • 3+ years of hands-on experience with developing data warehouse solutions and data products.
  • 1+ year of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Spark, Airflow, Kafka, etc.
  • 3+ years of hands-on experience in modeling and designing data schemas.
  • Advanced experience with programming languages: Python, PySpark, Scala, etc.
  • Knowledge of scripting languages: Perl, Shell, etc.
  • Experience working with, processing, and managing large data sets.
  • Experience with cloud tools for ingesting and processing data.
  • Experience with AWS big data tools and platforms: S3, EMR, EKS, Lambda, etc.
  • Experience with data ingestion and transformation tools like StreamSets and Databricks.
  • Experience working with DevOps teams.
  • Experience with container technologies such as Docker and Kubernetes.
  • Experience with data warehousing tools like Snowflake and Redshift.
Project Description:
  • The Data Platform team in our Company's Animal Health IT (MAHI-IT) designs and implements end-to-end data solutions to support customer-facing applications in animal traceability, monitoring, well-being, and more.
  • We seek a data engineer to help the team set up, maintain, optimize, and scale data pipelines from multiple sources and across different functional teams in a cloud environment.
  • Assist in developing best practices for deploying, monitoring, and scaling data pipelines in the cloud.
  • Identify requirements for ingestion, transformation, and storage of data.
  • Design and implement optimal and scalable data pipelines.
  • Use cloud tools to integrate data from multiple sources into the data lake, and design and implement ways to expose it.
  • Identify opportunities for automation and optimization of data pipelines, and for re-design of the data architecture and infrastructure for greater scalability and optimal delivery.
  • Implement the cloud/data infrastructure required to extract, transform, and load data from multiple sources.
  • Identify required security and governance procedures to keep the data safe in a cloud environment.
  • Assist in developing and executing testing plans to help with QA efforts.
This 9+ month position starts ASAP. Please e-mail your resume (as an attachment) with your rate and availability to Cindi.

ALPHA'S REQUIREMENT #22-01934

MUST BE ELIGIBLE TO WORK IN THE U.S. AS AN HOURLY W2 EMPLOYEE.

  • ID: #43715466
  • State: New Jersey
  • City: Madison 07940, USA
  • Salary: BASED ON EXPERIENCE
  • Job type: Contract
  • Posted: 2022-06-30
  • Deadline: 2022-08-28
  • Category: Et cetera