Data Engineer
Hybrid, NYC
Compensation Range: $80-$90/hour

Responsibilities:
- Creates and maintains optimal data pipeline architecture, integrating large, complex data sets that meet functional and non-functional business requirements.
- Identifies, designs, and implements internal process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS cloud native technologies.
- Builds analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Works with project stakeholders to assist with data-related technical issues and supports data infrastructure needs.
- Works with the data and analytics team to strive for greater functionality in our data systems.
- Participates in special projects and performs other duties, as required.
Requirements:
- Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, or a related field required.
- Minimum of five years of experience with advanced SQL, relational databases, and query authoring required.
- Experience building and optimizing big data pipelines, architectures and data sets required.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement required.
- Strong analytic skills related to working with structured/unstructured datasets required.
- Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management required.
- Working knowledge of message queuing, stream processing, and highly scalable data stores required.
- Experience with relational SQL and NoSQL databases such as Oracle, SQL Server, MySQL, Postgres, and MongoDB, as well as data warehouses such as Snowflake, required.
- Experience with data pipeline and workflow management tools such as AWS Glue, and Airflow required.
- Experience with AWS cloud services such as EC2, RDS, S3, Athena, Redshift, and DynamoDB required.
- Experience with stream-processing systems such as Storm and Spark Streaming (e.g., via PySpark) required.
- Experience with object-oriented/functional scripting languages such as Python, Java, C, and Scala required.
- ID: #49505989
- Location: New York City, NY 10001, USA
- Salary: USD TBD
- Job type: Permanent
- Posted: 2023-03-20
- Deadline: 2023-05-18
- Category: Architect/engineer/CAD