- Maintain existing Databricks notebooks, ADF pipelines, and Azure Function Apps; build new capabilities on top of the existing data extraction framework and pipelines
- Troubleshoot issues using log analytics and provide production support as needed. Work in an Agile development setting and collaborate closely with other team members on deliverables and dependencies.
- Build channel contracts and data consumption patterns for customer-facing (Online/Mobile) channels
- Analyze and validate data-sharing requirements with internal and external data partners
- Serve as the subject-matter expert and key point of contact between the operational data hubs and the Online/Mobile channel contracts
- Contribute to technical documentation and wiki pages.
- Support the creation and management of CI/CD build and deployment pipelines.
- Provide documentation for Production Support SMEs.
- Serve as the subject-matter expert and key point of contact between testing and requirements teams
- Work directly with key leaders to understand data requirements across the data lake, data hub, APIs, and analytical data warehouse
- Perform other duties as assigned
- Bachelor’s degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training, and experience
- Databricks Notebooks, Python, Spark/PySpark, Azure Cloud Basics, Azure Data Factory, Azure Functions, SQL Database, CI/CD DevOps
- Azure Cosmos DB, Azure Logic Apps, Event-driven Architecture, Big Data Frameworks, Azure DevOps (ADO)
- Proficient in enterprise API management products such as MuleSoft
- Proficient in web services, REST APIs, and event-driven and pub/sub messaging architectures
- Experience in the financial services industry and large banks