PETADATA is looking for a Senior Data Engineer (AWS, ETL, Kafka & Python) to work with one of our clients.

Job Location: San Francisco, CA (On-site)
Work Authorization: W2, Full-time
Visa: H1B Visa, TN Visa, EAD
Note: 2-4 business days per week will be on-site; local candidates are preferred.

PETADATA is looking for candidates to work on the design and development of large or complex ETL framework and Data API Python package projects, including application development, migrations, integrations, and enhancements to existing applications.

Roles & Responsibilities:
- Design and develop the Data Warehouse across the end-to-end lifecycle.
- Build distributed backend applications in the AWS cloud.
- Understand repeatable automated processes for building the application, testing it, documenting it, and deploying it at scale.
- Work as part of a growing, fast-paced, and highly flexible team; quickly learn new technologies and adapt to a changing development environment.
- Coordinate with data engineering developers and database administrators to deliver effective data-driven solutions.
- Work with, and incorporate feedback from, product designers and other stakeholders in the company.
- Establish quality processes to deliver a stable and reliable solution.
- Understand the project proposal and assist the team in analyzing how the new system or functionality can be integrated with the current environment.
- Identify and resolve performance and data-related issues.
- Provide documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) for all projects.
Requirements:
- 8+ years of experience as an ETL developer.
- 5+ years of hands-on coding experience in Java, Python, pandas, Django, and stream-processing services such as Kafka, AWS Kinesis, Apache Storm, and Spark Streaming.
- 4+ years of experience with CI/CD, dependency management, and build tools such as Jenkins, Gradle, Maven, Ant, and Ivy.
- 4+ years of experience with development methodologies, database platforms, and data modeling tools (ERwin/Model Manager).
- 3+ years of experience with cloud implementations, RESTful APIs, and server-side API integration tools.
- 3+ years of experience in building data warehouse solutions and Data Modeling.
- Expert-level skill in modeling, managing, scaling, and performance tuning high-volume transactional databases.
- Able to establish scalable, efficient, automated processes for dataset analysis, model development, and validation.
- Expertise in relational and NoSQL databases, and in batch data processing with SFTP/SSH on Unix/Linux.
- Financial services industry experience is a plus.
- Project management skills to lead and guide project teams.
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- Must have skills in AWS, Python, ETL, Apache Kafka, RESTful APIs, and Snowflake.
- ID: #23509581
- State: California
- City: San Francisco, 94101, USA
- Salary: Depends on Experience
- Job type: Permanent
- Showed: 2021-11-26
- Deadline: 2022-01-24
- Category: Software/QA/DBA/etc