Data Engineer (Kafka / Python)

19 Feb 2025

Vacancy expired!

For a financial client we need a Data Engineer (Kafka / Python). This position is based in Westlake, TX; Smithfield, RI; Merrimack, NH; or Durham, NC. We are primarily looking for W2 candidates and are not considering third-party candidates.

The Expertise we're looking for

  • Strong experience with relational database technologies (Oracle SQL & PL/SQL or similar RDBMS), preferably Snowflake or other cloud data platforms.
  • Proficiency in a programming language such as Python or Java. You write clean, well-tested code and are passionate about coding.
  • Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers.
  • Practical experience delivering and supporting Cloud strategies including migrating legacy products and implementing SaaS integrations.
  • Proven experience understanding multi-functional enterprise data, navigating between business analytics needs and the data itself, and working hand-in-hand with other members of technical teams to execute product roadmaps that enable new insights from our data.
  • Experience designing and implementing operational data stores, as well as data lakes, in production environments.
  • Experience with DevOps, continuous integration, and continuous delivery, including developing and deploying pipelines; deploying within a cloud-native infrastructure would be advantageous.
  • Able to work collaboratively with a geographically diverse team while providing technical leadership to junior team members.
  • Strong understanding of security aspects.
The Skills you bring
  • Proven track record of working in collaborative teams to deliver high quality data solutions in a multi-developer agile environment following design & coding best practices.
  • Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
  • Ability to develop ELT/ETL pipelines that move data to and from the Snowflake data store using a combination of Python and Snowflake SnowSQL.
  • Knowledge and expertise of data modelling techniques and best practices (relational, dimensional), plus any prior experience with data modelling tools (e.g. PowerDesigner).
  • Prior experience with data ingestion tool sets (e.g. Apache NiFi, Kafka) is advantageous.
  • Experience working with AWS, MS Azure, or other cloud providers. Experience with AWS services such as Lambda or S3; an AWS certification is a plus.
  • Experience in Data Architecture (Database design, performance optimization).
  • Prior experience in setting up reliable infrastructure (Hardware, Scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka.
  • Familiarity with modern infrastructure technologies such as Docker and Kubernetes.
  • Strong focus on resiliency and reliability.
  • You have excellent written and oral communication skills.
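To make the Kafka-to-Snowflake pipeline work described above concrete, here is a minimal sketch of a transform step such a pipeline might contain. The record fields, field names, and helper function are illustrative assumptions, not part of this posting; the Kafka consumer and Snowflake load calls are indicated in comments only.

```python
import json
from datetime import datetime, timezone

def transform(raw_message: bytes) -> dict:
    """Parse a raw Kafka message and normalise it for loading into Snowflake.

    Field names here ("id", "symbol", "amount") are hypothetical examples.
    """
    record = json.loads(raw_message)
    return {
        "trade_id": record["id"],
        "symbol": record["symbol"].upper(),
        "amount_usd": round(float(record["amount"]), 2),
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # In a real pipeline, this loop would read from a KafkaConsumer
    # (kafka-python) and batch rows into Snowflake via SnowSQL or the
    # Python connector, e.g. cursor.executemany(...) — omitted here so
    # the sketch runs standalone.
    sample = b'{"id": 1, "symbol": "aapl", "amount": "1520.505"}'
    print(transform(sample))
```

The transform is kept as a pure function so it can be unit-tested in isolation from the Kafka and Snowflake endpoints, which is the kind of clean, well-tested code the role calls for.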
The Value you deliver
  • Simplifying and effectively communicating technical challenges, solution options, and recommendations to business partners and technology leadership.
  • Providing technical leadership and support in data and solution design to team members, coaching others to their full potential.
  • Producing scalable, resilient, cloud-based system designs aligned with our long-term strategy.
  • Collaborating with chapter leads, squad leads, tech leads, and architects on setting technical roadmaps.
  • Recognizing opportunities to apply emerging technologies in innovative solutions to business challenges.
  • Understanding detailed requirements and delivering solutions that meet or exceed customer expectations.
  • Taking ownership and accountability.
  • Adapting to a changing and challenging workload.

  • ID: #49305071
  • State: North Carolina, USA (Durham, 27709)
  • City: Durham
  • Salary: USD TBD
  • Job type: Permanent
  • Posted: 2023-02-19
  • Deadline: 2023-04-18
  • Category: Et cetera