Data Engineer

05 Nov 2024

Vacancy expired!

We are looking for a data engineer to join our engineering team in support of our data management and rationalization strategy. You will join a dynamic development team in the Technology and Operations business with a mandate to transform and accelerate our software delivery pipelines. We are building modern integrations such as streaming APIs and beginning to modernize our data and DevOps pipelines toward a unified data strategy. The ideal candidate will have a background in data engineering and be knowledgeable and passionate about their work. Beyond technical knowledge and curiosity, strong interpersonal communication skills and the ability to articulate complex concepts clearly will help you succeed in this position.

What will you do?

Support Development and DevOps in T&O

You will support the Engineering, DevOps, and data teams in building and enhancing data pipelines. This includes:

  • establishing DevOps tooling to aggregate, cleanse, transform, and ingest data from various sources, then enrich and disseminate it to environments where it can be modelled and analyzed
  • assisting with the migration of existing ETL-based data aggregation to new data pipelines
  • operationalizing streaming data flows and setting up streaming data processors
  • configuring pipeline tools for scalability and monitoring
  • collaborating with data scientists to prototype and operationalize machine learning models

What do you need to bring?

Required experience and skills

  • At least 3 years of experience in Software Engineering or Analytics
  • Excellent skills and at least 2 years of past work experience with:
      • setting up many-to-many data pipelines with Apache Kafka and Kafka Connect
      • running and optimizing SQL queries on RDBMSs such as MS SQL Server, MySQL/MariaDB, or Hive
      • at least one NoSQL database, such as MongoDB, Cassandra, or HBase
      • Java and/or Bash scripting
  • Knowledge of enterprise data architectures, distributed and microservice software architectures and design patterns

Nice to have

  • Experience with:
      • the ELK stack, including Logstash and Elasticsearch
      • MapReduce, Hive, or Spark
      • container-based virtualization, e.g. Kubernetes or Docker
      • BI reporting using Tableau
  • Working knowledge of:
      • producing and consuming REST services
      • ORM frameworks like Hibernate
      • CI/CD tools like Jenkins, UCD, or Ansible, and Git repositories like GitHub or Bitbucket
  • Knowledge of authorization frameworks like OAuth 2

What's in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice that helps our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference in our communities, and achieving mutual success.

  • Leaders who support your development through coaching and managing opportunities
  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • A world-class training program in financial services
  • Flexible work/life balance options
  • Choice of computer: Windows or Mac
  • Casual dress code
  • Agile work environment
  • Work with a highly engaged and motivated team

  • ID: #22211041
  • State: North Carolina, USA
  • City: Raleigh / Durham / CH, 27601
  • Salary: $120.00 - $120.00 per annum
  • Job type: Permanent
  • Showed: 2021-11-05
  • Deadline: 2022-01-03
  • Category: Et cetera