- Provide direction to, and lead, our data engineering and architecture team, determining the optimal approach to business demands.
- Implement data analytics best practices in designing data models, ETL pipelines, and near-real-time data solutions.
- Coordinate with business analysts to validate requirements; conduct interviews with users and developers.
- Implement solutions to integrate external data with in-house data.
- Test and validate data flows, and prepare ETL processes according to business requirements.
- Design a multi-tenant data platform.
- Lead technical architecture discussions and help drive technical implementations.
- Design and implement a data conversion strategy from legacy to new platforms.
- Perform design validation, reconciliation, and error handling in data load processes.
- 10+ years' experience in Data Analytics & Business Intelligence projects
- Extensive knowledge of BI concepts (ETL, dimensional modeling, data warehouse design)
- Experience creating data pipelines using Python/Spark, and using traditional ETL/ELT tools such as Informatica and Pentaho
- Experience designing columnar databases such as Vertica, Redshift, or Snowflake
- Experience building monitoring and alerting mechanisms for data pipelines
- Working knowledge of developing integrations using APIs such as REST and SOAP, and via JDBC/ODBC connections
- Strong analytical and problem-solving skills; excellent written and verbal communication skills
- AWS architecture experience preferred
- Nice to have: experience implementing data streaming processes using Kafka or Kinesis, and NoSQL databases
- ID: #49350880
- State: South Carolina, USA
- City: Columbia, 29201
- Salary: Depends on Experience
- Job type: Permanent
- Showed: 2023-02-26
- Deadline: 2023-04-25
- Category: Et cetera