Senior Data Engineer (Teradata, Hadoop)

13 May 2024

Vacancy expired!

Job Description:

ETL Developer (Teradata, Hadoop)

Location: Charlotte, NC

Duration: 12 months

Description:

Position Summary: We are seeking a highly technical and experienced Application ETL Developer / Data Engineer to support the design, architecture, and development of Enterprise Information / Data Science products owned and operated by the Enterprise Data Science Technology (EDST) team. This role supports application development on enterprise Teradata/SAS Grid platforms. Qualified candidates must be well versed in building ETL/analytics applications using complex SQL (Teradata) and Hadoop-based (Hive/Spark/PySpark) programs.
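For context on the kind of workload this role covers, here is a minimal PySpark sketch of an extract-transform-load step against Hive; the database, table, and column names are hypothetical and used purely for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical job: summarize raw card transactions into a curated Hive table.
spark = (
    SparkSession.builder
    .appName("daily_txn_summary")
    .enableHiveSupport()  # allow reading/writing Hive-managed tables
    .getOrCreate()
)

# Extract: raw transactions landed in Hive (hypothetical schema).
txns = spark.table("raw_db.card_transactions")

# Transform: daily spend per account over the last 30 days.
summary = (
    txns.where(F.col("txn_date") >= F.date_sub(F.current_date(), 30))
        .groupBy("account_id", "txn_date")
        .agg(
            F.sum("amount").alias("total_spend"),
            F.count("*").alias("txn_count"),
        )
)

# Load: overwrite the partitioned summary table consumed downstream.
(summary.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .saveAsTable("curated_db.daily_txn_summary"))
```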

Responsibilities: Typically 7-10 years of experience.
  • Develops, enhances, debugs, supports, maintains, and tests data applications that support business units or supporting functions. These application program solutions may involve diverse data development platforms, software, hardware, technologies, and tools.
  • Supports end-to-end data-warehouse application architecture and development, including requirements gathering, design, coding, testing, and implementation of complex ETL and analytic applications and ETL workloads in SQL (Teradata).
  • Works directly with product owners to understand requirements, aligns them to the platform technology stack, and designs and documents solutions/methods for data sourcing and provisioning.
  • As a Developer / Data Engineer, takes accountability/ownership of the coding, testing, and deployment framework for end-to-end ETL application development using agile methodology and SDM tools (JIRA, Bitbucket) on the enterprise data platform(s).
  • Supports systems through maintenance, modification, and problem resolution to support ongoing delivery of application ETL and analytics services and/or operations on the big data platform.
  • Serves as a fully seasoned/proficient technical resource; should be ready to get into the weeds of the code, analyze and research data and optimization problems, and discuss technical details with the development/support team.
  • Provides technical knowledge and capabilities as a team member and individual contributor, but is also responsible for instructing, directing, and checking the quality and timeliness of other systems professionals, including offshore resources.
  • May lead multiple projects with competing deadlines.
  • Works under minimal supervision, with general guidance from manager.

Required Skills:
  • Computer Science/Software Engineering (or related) degree
  • 7+ years' experience with end-to-end ETL and analytics application development on Teradata-based data-warehouse and analytical platforms
  • Extensive experience developing Teradata SQL-based ETL and analytic workflows using native utilities (BTEQ, TPT, FastExport)
  • Very good knowledge of Unix/Linux shell scripting and scheduling (e.g., Autosys)
  • Knowledge of and experience with CI/CD-based development and deployment (JIRA, Bitbucket)
  • Experience with Big Data technologies, programs, and toolsets such as Hadoop, Hive, Sqoop, Impala, Kafka, and Python/Spark/PySpark workloads is a plus
  • Excellent written, communication, and diagramming skills
  • Strong analytical and problem-solving abilities
  • Speaking/presentation skills in a professional setting
  • Excellent interpersonal skills and a team-player mindset for working with global teams and business partners
  • Positive attitude and flexibility
  • Willingness to learn new skills and adapt to change
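As an illustration of the Teradata SQL side of the stack, the sketch below uses the teradatasql Python driver to run a typical aggregate-and-load step and a lightweight export. The posting itself emphasizes native utilities (BTEQ, TPT, FastExport); the driver-based approach, host, credentials, and table names here are assumptions for illustration only.

```python
import csv
import teradatasql

# Hypothetical connection details and table names, for illustration only.
with teradatasql.connect(host="td-prod.example.com",
                         user="etl_svc",
                         password="********") as con:
    with con.cursor() as cur:
        # Typical analytic ETL step: aggregate a staging table
        # and insert the result into a reporting table.
        cur.execute("""
            INSERT INTO rpt_db.daily_txn_summary (account_id, txn_date, total_spend)
            SELECT account_id, txn_date, SUM(amount)
            FROM   stg_db.card_transactions
            WHERE  txn_date >= CURRENT_DATE - 30
            GROUP  BY account_id, txn_date
        """)

        # Lightweight export for a downstream consumer (a FastExport/TPT
        # job would be the usual choice for large volumes).
        cur.execute("SELECT account_id, txn_date, total_spend "
                    "FROM rpt_db.daily_txn_summary")
        with open("daily_txn_summary.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([d[0] for d in cur.description])
            writer.writerows(cur.fetchall())
```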

Desired Skills: Industry certifications in SQL (Teradata) and Analytics (Tableau)

  • ID: #49925945
  • State: North Carolina, Charlotte 28201, USA
  • City: Charlotte
  • Salary: $60+
  • Job type: Contract
  • Showed: 2023-05-13
  • Deadline: 2023-07-01
  • Category: Et cetera