Vacancy expired!
Job Details:
- Bachelor’s or Master’s degree in a technology-related field (e.g., Computer Engineering, Computer Science, etc.)
- Desire and ability to learn and implement new technologies
- Keen ability to see complex challenges from multiple perspectives and to solve them independently or with others
- Knowledge of how to effectively use multiple types of databases, such as relational databases (Oracle, PostgreSQL, etc.), NoSQL databases (DynamoDB, Elasticsearch), and graph databases (Neptune, Neo4j, etc.)
- Hands-on experience building scalable, resilient, and cost-effective data products (preferably on AWS and Snowflake)
- Hands-on experience in programming or scripting using Unix shell, Python, Java, etc.
- Thorough understanding of data modeling and data integration patterns
- Experience developing batch jobs (preferably with AWS EventBridge, Step Functions, S3, Lambda, EC2, ECS/EKS, etc.)
- Ability to schedule and orchestrate batch processes and workflows (Airflow, Argo, cron, etc.)
- Demonstrated experience developing, debugging, and tuning complex SQL statements, PL/SQL packages & procedures
- Experience developing ETL/ELT data pipelines
- Knowledge of Messaging Technologies (Kafka, Kinesis, SNS, SQS)
- Experience with DevOps or CI/CD Pipelines using Maven, Jenkins, AWS CFTs, uDeploy, Stash, Ansible, etc.
- Experience managing testing, code deployment, and release management processes
- Ability to validate, monitor, and resolve issues during development, testing, or in production
- Ability to work effectively in global teams distributed across geographic locations in an Agile way
- Excellent communication skills, both written and verbal
- Proven knowledge of AWS via Associate, Professional, or Specialty Certification(s) a big plus
- ID: #48829279
- State: North Carolina, 27601, USA
- City: Raleigh / Durham / CH
- Salary: Depends on Experience
- Job type: Contract
- Showed: 2023-01-26
- Deadline: 2023-03-25
- Category: Et cetera