- Perform data analysis on production issues; identify root causes, then triage and track issues with the relevant application and platform teams.
- Own issue resolution and verify that implemented solutions work.
- Prevent issue recurrence by implementing quality checks on the interfaces.
- Apply in-depth troubleshooting skills to resolve errors and performance issues, including tier-2 production support.
- Track ongoing business queries through resolution; collaborate with business stakeholders and product owners to operationalize new products and enhancements.
- Support end-user training, drive continuous optimization and streamlining of operations, manage vendor operations, and track operational performance metrics.
- Provide ongoing operations status updates to data management leadership and business stakeholders.
- Maintain operations tickets in Jira and provide ongoing reporting of team velocity, sprint/release burndowns, and ticket-resolution metrics.
- Review the operations schedule with the identified project team, product owners, functional-area stakeholders, and the executive management involved in or affected by operations activities.
- Establish a communication plan and schedule to keep stakeholders, including appropriate staff across the organization, updated on the progress of operations.
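The velocity and burndown tracking mentioned above boils down to simple arithmetic over completed story points. A minimal sketch, assuming ticket data has already been exported from Jira (the function names, the per-sprint point totals, and the `(day, points)` event format are illustrative assumptions, not from this posting):

```python
from collections import defaultdict

def team_velocity(points_per_sprint):
    """Average story points completed per sprint (simple mean)."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprint_burndown(total_points, completions):
    """Remaining story points at the end of each sprint day.

    `completions` is a list of (day, points_completed) events,
    with days numbered from 1.
    """
    done = defaultdict(int)
    for day, pts in completions:
        done[day] += pts
    remaining, curve = total_points, []
    for day in range(1, max(done) + 1):
        remaining -= done.get(day, 0)  # idle days burn nothing
        curve.append(remaining)
    return curve
```

For example, a 40-point sprint with completions on days 1, 3, and 4 yields a step-shaped burndown curve; in practice these numbers would come from Jira's own reports rather than a hand-rolled script.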
- 6+ years of hands-on experience writing SQL and/or Python.
- 2+ years working with cloud technologies such as AWS and Databricks.
- 2+ years working in ETL & data warehousing environments.
- Strong knowledge of relational and columnar database systems.
- Strong understanding of data warehouse integration processes, analysis, and reporting tools.
- Excellent oral and written communication skills, business acumen, and strong problem-solving and analytical skills.
- Up-to-date specialized knowledge of data wrangling, manipulation, and management technologies to effect change across business units.
- Ability to work in an agile, rapidly changing environment while producing high-quality deliverables.
- Experience working in Agile environments and using Jira for project management.
- Understanding of web services and JSON formats.
- Understanding of Hadoop and Spark concepts.
- Additional Languages: Python, Scala
- Experience with data formats including Parquet.
- Experience using Spark (Scala or PySpark)
- Understanding of AWS (S3, EC2, Redshift, EMR, Athena)
- Experience with tools like Databricks, Qlik, Tableau, Power BI
- Understanding or application of machine learning and/or deep learning.
- Experience in a Pharma Commercial domain.
- Experience using a Data Lake/Data Warehouse environment.
- Bachelor's Degree in Information Systems or related discipline preferred.
- Certifications in relevant areas preferred.
- ID: #49371804
- State: Massachusetts, USA
- City: Cambridge 02138
- Salary: Up to $90
- Job type: Contract
- Showed: 2023-02-27
- Deadline: 2023-04-21
- Category: Writing/editing