Vacancy expired!
We are seeking a Senior Data Pipeline Engineer for a consulting engagement with a global, household-name client. This position can be based in Los Angeles, Seattle, or NYC.
Skills:
- Data Modelling – various data models, including data warehouse designs
- Exceptional SQL – SQL, PL/SQL, T-SQL, etc.
- Python, plus expert JavaScript/TypeScript
- Airflow (Apache workflow management tool for building data pipelines)
- Drive test automation for data pipelines in a CI/CD (continuous integration / continuous delivery) environment
- Financial Accounting Applications
- Reporting and Analytics
- Develop a technical story backlog from high-level business requirements, collaborate on design, and estimate story points (i.e., create technical specifications, time estimates, and dependencies from high-level business requirements)
- Building data lake / data warehouse solutions on AWS (S3, Athena, Glue, and EMR)
- Binary data serialization – Parquet preferred (converting data structures and program objects into a binary form that can be stored, streamed, or sent elsewhere, then deserialized back into objects identical to the originals)
- Git
- AWS Managed Workflows for Apache Airflow (MWAA) and Glue
- Ad Tech platforms such as Operative and STAQ
- Understanding of SOX compliance needs
- Airflow Operator types, including REST, Lambda, ECS
- Aurora/Hive (databases)
- Spark (large-scale data processing)
- Airflow (workflow management)
- Docker (software packaging and delivery)
- AWS (development and hosting)
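To illustrate the binary serialization bullet above: a minimal, dependency-free sketch of round-tripping a record through a binary format using only Python's standard-library `struct` module. The posting prefers Parquet, which in practice would come from a library such as pyarrow; the record layout, field names, and format string here are illustrative assumptions, not anything specified by the role.

```python
import struct

# Fixed-width binary layout for a hypothetical (user_id, score) record:
# little-endian, 4-byte unsigned int followed by an 8-byte float.
# In production this role would use Parquet (e.g. via pyarrow) instead;
# struct just keeps the round-trip idea self-contained.
RECORD_FMT = "<Id"
RECORD_SIZE = struct.calcsize(RECORD_FMT)  # 12 bytes

def serialize(user_id: int, score: float) -> bytes:
    """Convert an in-memory record into bytes suitable for storage or streaming."""
    return struct.pack(RECORD_FMT, user_id, score)

def deserialize(blob: bytes) -> tuple:
    """Reconstruct the original record from its binary form."""
    user_id, score = struct.unpack(RECORD_FMT, blob)
    return user_id, score
```

A round trip preserves the record exactly, which is the property the bullet describes: `deserialize(serialize(42, 0.5))` yields `(42, 0.5)`.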
- ID: #40720246
- State: Washington, USA
- City: Seattle-Tacoma, 98101
- Salary: Depends on Experience
- Job type: Contract
- Posted: 2022-05-12
- Deadline: 2022-07-10
- Category: Et cetera