Senior Data Engineer

02 Dec 2024

Vacancy expired!

PETADATA is looking for a

Senior Data Engineer (AWS, ETL, Kafka & Python) to work with one of our clients.

Job Location: San Francisco, CA (On-site)

Work Authorization: W2, Full-time

Visa: H1B Visa, TN Visa, EAD.

Note: 2-4 business days per week will be worked on-site; local candidates are preferred.

PETADATA is looking for candidates to design and develop large or complex ETL framework and Data API Python package projects, including application development, migrations, integrations, and enhancements to existing applications.

Roles & Responsibilities:
  • Design and develop the data warehouse across the end-to-end lifecycle.
  • Build distributed backend applications in the AWS cloud.
  • Build repeatable, automated processes for building, testing, documenting, and deploying the application at scale.
  • A desire to work as part of a growing, fast-paced, and highly flexible team, with the ability to quickly learn new technologies and adapt to a fast-paced development environment.
  • Coordinate with data engineering developers and database administrators to deliver effective data-driven solutions.
  • Work with, and incorporate feedback from, product designers and other stakeholders in the company.
  • Establish quality processes to deliver a stable and reliable solution.
  • Understand the project proposal and help the team analyze how the new system or functionality can be integrated with the current environment.
  • Identify and resolve performance and/or data-related issues.
  • Provide documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) for all projects.

Required Skills:
  • 8+ years of experience as an ETL developer.
  • 5+ years of hands-on coding experience in Java, Python, pandas, Django, and stream-processing services such as Kafka, AWS Kinesis, Apache Storm, and Spark Streaming.
  • 4+ years of experience with CI/CD, dependency management, and build tools such as Jenkins, Gradle, Maven, Ant, and Ivy.
  • 4+ years of experience with development methodologies, database platforms, and data modeling tools (ERwin/Model Manager).
  • 3+ years of experience with cloud implementation and RESTful/server-side API integration tools.
  • 3+ years of experience building data warehouse solutions and data modeling.
  • Expert-level skill in modeling, managing, scaling, and performance-tuning high-volume transactional databases.
  • Able to establish scalable, efficient, automated processes for dataset analysis, model development, and validation.
  • Expertise in relational and NoSQL databases, plus batch data processing with SFTP/SSH on Unix/Linux.
  • Financial services industry experience is helpful.
  • Project management skills to lead and guide project teams.

Preferred Qualifications:
  • Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
  • Skills in AWS, Python, ETL, Apache Kafka, RESTful APIs, and Snowflake.

If you are interested and meet the above job requirements, please submit your resume to

After carefully reviewing your experience and skills, one of our hiring team members will contact you about the next steps.

  • ID: #23748374
  • State: California, 94101, USA
  • City: San Francisco
  • Salary: Depends on Experience
  • Job type: Permanent
  • Showed: 2021-12-02
  • Deadline: 2022-01-30
  • Category: Software/QA/DBA/etc