We have an immediate need for a Senior Engineer, Data Platforms, to join a world-renowned management consulting firm. This is a direct-hire, hybrid role. You will join their Waltham or Atlanta office as part of their One Firm Tech-Cloud Data & Analytics team, working with product managers, software engineers, architects, and various platform teams. You'll be part of a team responsible for delivering the technology-enabled solutions of the future, and you will be involved in all business value chain activities, from understanding product needs to product development to ongoing maintenance and enhancement.
What You Will Do:
- You will be responsible for creating innovative interoperability platforms, tools, and solutions to enable seamless and secure data integration.
- In this role, your solutions will connect legacy, newly developed, and vendor applications across datacenter and cloud environments, and you will be responsible for the full lifecycle of those solutions.
- You will develop specifications, design infrastructure and interfaces, and write code. Engineers and Senior Engineers spend roughly 80% of their time hands-on coding; Senior Engineers also coach and mentor.
- You will design and build scalable, secure ETL pipelines in PySpark.
- You will develop complex PySpark code using SparkSQL, DataFrames, joins, transposes, etc., to load data into an MPP data warehouse (Snowflake).
- You will apply a strong understanding of Python and Spark concepts such as SparkSQL, DataFrames, joins, and transposes.
- You will create ETL data pipelines in PySpark that read from Kafka topics, RDBMSs, APIs, and other sources and load into object storage (e.g., S3).
What You Will Bring:
- Bachelor's/Master's degree in a technology-related field
- 5+ years of IT experience, including 3+ years in data engineering and ETL/ELT
- Experience with the Java/J2EE tech stack.
- Experience designing and developing data pipelines using PySpark in a public cloud (e.g., AWS, GCP, Azure) or hybrid environments.
- Proficient in SQL, data modelling and data warehouse concepts.
- Proficient in developing microservices using Java and Spring Boot, including REST API creation and consumption.
- Conceptual understanding of modern software engineering patterns, including those used in highly scalable, distributed, and resilient systems.
- Solid understanding of NoSQL databases such as MongoDB and Elasticsearch; experience with Kubernetes, Docker, and CI/CD pipeline configuration.
- Experience in developing and delivering systems on AWS cloud platform or equivalent.
- Experience with the AWS SDK and Lambda; MPP data warehouses (e.g., Snowflake, BigQuery, Redshift); GraphQL API development; and AWS Glue, Glue Studio, and Blueprints.
- Experience implementing effective cloud-based data migration and data integration strategies.
- ID: #43309649
- State: Massachusetts, USA
- City: Boston
- Salary: USD0 - USD0
- Job type: Permanent
- Showed: 2022-06-19
- Deadline: 2022-08-17
- Category: Et cetera