Vacancy expired!
- Overall experience must be 12+ years.
- Lead the development of efficient, scalable data pipelines to support operational processes, analytics, and reporting use cases.
- Provide technical guidance to drive the design and delivery of the organization’s data mart roadmap.
- Partner with IT, the Business Intelligence team, and non-technical functional stakeholders to define needs and continuously improve the data platform.
- Design and build mechanisms to manage data from a wide variety of sources.
- Define and evaluate key tasks and acceptance criteria of data systems from development to production.
- Normalize complicated data sources into usable formats.
- Write scripts to automate manual operational processes using disparate data sources.
- Optimize and automate existing processes, refactor code, establish standards, write documentation, and partner with IT and DBAs to troubleshoot and minimize failures and downtime.
- Bachelor’s degree in Computer Science, Computer Engineering, or a related technical degree. Master’s preferred.
- 5+ years’ experience building data delivery services to support critical operational processes, reporting, and analytical models.
- Demonstrated strength in SQL and Python, data modeling, data warehousing, ETL development, and process automation.
- Working knowledge of cloud environments and server management; experience with AWS and Snowflake technologies.
- Experience with query optimization, performance monitoring, and troubleshooting pipeline failures.