- Strong experience with SSIS on SQL Server and with Linux/Unix scripting.
- At least 7 years' experience developing enterprise-class data pipelines using Informatica PowerCenter, MS SQL Server, and Control-M.
- At least 2 years of experience working in Agile/Scrum teams.
- Good understanding of data warehousing concepts, with SQL expertise in SQL Server, Oracle, or another database.
- Extensive experience designing and deploying SQL Server ETL solutions in a data warehousing/BI environment involving complex data transformations, change data capture (CDC), and error handling.
- Advanced experience with SQL Server ETL development and its related tools (SSIS, SSRS, SSMS, Visual Studio), including extracting, transforming, and loading data from various data sources and APIs.
- Advanced experience developing and troubleshooting production SSIS packages. Solid hands-on experience with the ETL technology in use on projects. Strong coding expertise and an extensive understanding of the systems development life cycle (SDLC).
- Excellent programming skills are required as well (SQL, PowerShell, JavaScript, HTML, etc.). Solid hands-on knowledge of database concepts of data modeling and mapping, including normalization, dimensionality, referential integrity, indexes, keys, master data, and metadata.
- Advanced knowledge of appropriate test methodologies and QA processes, including writing test plans and test cases.
- Experience in Python or other programming languages.
- Ability to understand business needs and translate them into scalable, automated solutions.
- Experience writing and tuning complex SQL queries, plus knowledge of database design and development (SQL and T-SQL); see the query sketch after this list.
- Ability to create ETL data flow diagrams based on business requirements and pseudocode.
- Strong technical background covering ETL environments with PowerCenter, shell scripting, and Python.
- Able to work with multiple teams to understand requirements and to design, develop, and test interfaces.
- Strong data warehousing skills, including data profiling, data masking, and auditing in ETL.
- Experience extracting data from a variety of sources (SQL Server, Oracle), and a desire to expand those skills.
- Good coordination and data analysis skills.
- Experience with unit and integration testing processes.
- Cloud experience with AWS, Azure, or Snowflake, along with any relevant certifications, is a plus.
- Designing processes that can extract or receive data from heterogeneous source systems.
- Design, develop, test, and support new capabilities and ongoing changes within the various application data marts.
- Building capabilities to exchange data with external partners in various formats (XML, CSV, flat file, JSON); a minimal format sketch follows this list.
- Performing data cleansing and transformation according to business rules.
- Developing reusable frameworks for data extraction, loading, and cleansing (see the pipeline sketch after this list).
- Work with data owners to document data mappings and transformations to support effective downstream analytics and alerts.
- Update the team daily or weekly on any project blockers and red flags.
- Provide guidance on performance improvement, scalability, and reusability of code and data objects.
- Raise access requests for any database servers, tools, and SharePoint portals required for the project.
- Apply specialized knowledge in assembly or integration, cross-discipline functions, knowledge engineering, industry expertise, or legacy evolution.
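To give a feel for the "complex SQL plus tuning" expectation above, here is a minimal sketch. SQLite stands in for SQL Server, and the orders table, its columns, and the index are hypothetical; the idea (a windowed query plus a query-plan check) carries over to T-SQL.

```python
# Illustrative only: SQLite stands in for SQL Server; the "orders"
# table, its columns, and the index are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount      REAL    NOT NULL,
        ordered_at  TEXT    NOT NULL
    );
    CREATE INDEX ix_orders_customer ON orders (customer_id, ordered_at);
""")
conn.executemany(
    "INSERT INTO orders (customer_id, amount, ordered_at) VALUES (?, ?, ?)",
    [(1, 120.0, "2024-01-05"), (1, 80.0, "2024-02-10"), (2, 200.0, "2024-01-20")],
)

# A windowed query: each customer's orders ranked by recency.
query = """
    SELECT customer_id,
           order_id,
           amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY ordered_at DESC
           ) AS recency_rank
    FROM orders
"""
for row in conn.execute(query):
    print(row)

# Tuning check: inspect the plan to see whether the composite index is used.
for step in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(step)
```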
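For the multi-format partner exchange mentioned above, a minimal sketch using only the Python standard library; the record layout and output file names are hypothetical.

```python
# Illustrative only: the record layout and file names are hypothetical.
import csv
import json
import xml.etree.ElementTree as ET

records = [
    {"id": "1001", "partner": "acme", "amount": "250.00"},
    {"id": "1002", "partner": "globex", "amount": "75.50"},
]

# CSV: header row derived from the dict keys.
with open("outbound.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)

# JSON: one array of objects.
with open("outbound.json", "w") as f:
    json.dump(records, f, indent=2)

# XML: one <record> element per row.
root = ET.Element("records")
for rec in records:
    node = ET.SubElement(root, "record")
    for key, value in rec.items():
        ET.SubElement(node, key).text = value
ET.ElementTree(root).write("outbound.xml", encoding="utf-8", xml_declaration=True)
```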
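And for the reusable cleansing framework, one way to sketch it is as composable row-level steps; the cleansing rules and field names here are hypothetical, not a prescribed design.

```python
# Illustrative only: the cleansing rules and field names are hypothetical.
from typing import Callable, Dict, Iterable, Iterator

Row = Dict[str, str]
Step = Callable[[Row], Row]

def strip_whitespace(row: Row) -> Row:
    """Trim stray whitespace from every field."""
    return {key: value.strip() for key, value in row.items()}

def normalize_country(row: Row) -> Row:
    """Map free-text country names onto ISO-style codes (hypothetical rule)."""
    aliases = {"united states": "US", "u.s.a.": "US", "uk": "GB"}
    country = row.get("country", "").lower()
    return {**row, "country": aliases.get(country, row.get("country", ""))}

def run_pipeline(rows: Iterable[Row], steps: Iterable[Step]) -> Iterator[Row]:
    """Apply each cleansing step to every row, in order."""
    for row in rows:
        for step in steps:
            row = step(row)
        yield row

if __name__ == "__main__":
    raw = [{"name": "  Jane Doe ", "country": "United States "}]
    for clean in run_pipeline(raw, [strip_whitespace, normalize_country]):
        print(clean)
```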