AWS Data Engineer (Datalake)

06 Feb 2025

Job Purpose: Responsible for performing difficult or complex analysis, design, and programming, involving multi-project leadership and broad responsibility in support of new and existing data engineering projects and production support activities.

Core Duties:

  • Collaborate with IT and business partners to devise a data strategy that meets Stanford's requirements.
  • Build a data and technology inventory and drive the architecture.
  • Demonstrate a deep understanding of and commitment to software engineering principles and processes (e.g., Lean, Agile, DevOps, CI/CD), and to continuous improvement through measurement.
  • Apply thorough knowledge of and practical expertise in data management frameworks to design world-class data stores; best practices, data quality, and security are critical.
  • Understand data endpoints and consumers, and develop a strategy to serve them.
  • Maintain a fluid end-to-end data vision and design pipelines for seamless data flow.
  • Lead and perform the design, development, implementation, and maintenance of complex data store, data lake, lakehouse, and data warehousing systems and data-intensive solutions that are scalable, optimized, and fault-tolerant.
  • Design and implement data migration and data integration across cloud and hybrid environments.
  • Demonstrate mastery of and hands-on experience with data engineering technologies and scripting languages; identify new technologies and provide recommendations to management.
  • Bring a solid understanding of and experience in cloud technologies and applications: data migration, integration, API development, data streaming (batch and continuous), and scheduling.
  • Lead and mentor junior data engineers and support them with best design practices.
  • Apply data modeling skills: devise a canonical data model that simplifies data flow and interaction between applications, and integrate new sources smoothly (a minimal sketch follows this list).
  • Translate complex functional and technical requirements into detailed architecture, design, and high-performing software.
  • Design, build, and optimize data collection pipelines for storage, access, and analytics.
  • Apply out-of-the-box thinking to overcome engineering challenges with innovative design principles.
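
To illustrate the canonical data model duty above, here is a minimal, purely illustrative Python sketch; the CanonicalCustomer dataclass and all source field names are hypothetical assumptions, not part of the posting:

    from dataclasses import dataclass
    from typing import Any, Dict

    # Hypothetical canonical shape: every source system maps into this one
    # model, so downstream pipelines only ever deal with a single schema.
    @dataclass
    class CanonicalCustomer:
        customer_id: str
        email: str
        full_name: str
        source_system: str

    def from_salesforce(record: Dict[str, Any]) -> CanonicalCustomer:
        # Salesforce-side field names are assumptions for illustration.
        return CanonicalCustomer(
            customer_id=record["Id"],
            email=record["Email"],
            full_name=f"{record['FirstName']} {record['LastName']}",
            source_system="salesforce",
        )

    def from_workday(record: Dict[str, Any]) -> CanonicalCustomer:
        # Integrating a new source only requires one new adapter function;
        # nothing downstream of the canonical model has to change.
        return CanonicalCustomer(
            customer_id=record["worker_id"],
            email=record["primary_email"],
            full_name=record["display_name"],
            source_system="workday",
        )
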
Minimum Requirements
Education and Experience:
  • Bachelor's degree and eight years of relevant experience, or a combination of education and relevant experience.
Knowledge, Skills and Expertise:
  • Thorough understanding of and experience in data lake, lakehouse, and data warehousing architecture; able to propose, architect, and implement data lake, lakehouse, and data warehouse solutions with the available cloud tools and programming.
  • Experience with DataOps and its related practices, processes, and technologies.
  • Experienced in data migration and data integration; knows the pain points of data integration across SaaS applications and implements the solution that best fits the organization.
  • Hands-on experience and expertise in advanced SQL, advanced Python programming, AWS and other data engineering tools, Snowflake, Informatica, SnapLogic, Kafka, Airflow, Oracle Cloud data lake, and other open-source tools. Experience with data migration/integration tools such as AWS Database Migration Service, AppFlow, MuleSoft, RJ, open-source tools, and other market-available tools.
  • Experience writing reusable, complex Python scripts for ELT, business logic, or APIs (see the illustrative sketch after this list). Other coding experience, such as Scala and R, is a plus.
  • Hands-on development work on all aspects of data analysis, data provisioning, modeling, performance tuning and optimization.
  • Experience working in the AWS cloud environment, selecting the right tools from the marketplace and using them efficiently to meet business requirements.
  • Mastery of relational, NoSQL or NewSQL database systems. Expertise in working with unstructured, structured and semi-structured data.
  • Ability to build scalable data pipelines for both real-time and batch processing, applying best practices in data modeling and ETL/ELT processing across various technology stacks.
  • Experience in designing and implementing tight data security at various levels.
  • Experience in streaming data from SaaS/PaaS applications such as Salesforce, ServiceNow, Workday, Oracle Cloud, Marketo, and others.
  • Experience in data migration and integration across cloud and on-premises systems.
  • Ability to constantly monitor operations and tune for better performance and utilization.
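
As a concrete, purely illustrative example of the reusable Python ELT/ETL scripting listed above, the sketch below extracts a CSV from S3 with boto3, applies simple business logic, and loads the result idempotently; the bucket, key, table, and column names are hypothetical, and sqlite3 stands in for a real warehouse such as Snowflake:

    import csv
    import io
    import sqlite3  # stand-in warehouse for illustration; swap for Snowflake, etc.

    import boto3  # real AWS SDK; the bucket and key below are hypothetical

    def extract(bucket: str, key: str) -> list:
        # Pull a CSV object from S3 and parse its rows into dicts.
        body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
        return list(csv.DictReader(io.StringIO(body.decode("utf-8"))))

    def transform(rows: list) -> list:
        # Business logic: drop rows without an email and normalize case.
        return [(r["id"], r["email"].strip().lower()) for r in rows if r.get("email")]

    def load(rows: list, conn: sqlite3.Connection) -> None:
        # Idempotent load: INSERT OR REPLACE keyed on the primary key, so
        # re-running the script does not duplicate data.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS contacts (id TEXT PRIMARY KEY, email TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO contacts VALUES (?, ?)", rows)
        conn.commit()

    if __name__ == "__main__":
        conn = sqlite3.connect("warehouse.db")
        load(transform(extract("example-bucket", "contacts.csv")), conn)
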

  • ID: #49039269
  • State: California, USA
  • City: Redwood City (ZIP 94061)
  • Salary: $80
  • Job type: Contract
  • Showed: 2023-02-06
  • Deadline: 2023-03-17
  • Category: Et cetera