Data Analytics and Reporting Architect

19 Feb 2025

Vacancy expired!

Essential Duties and Responsibilities:

• Help design, deploy, manage, and support the systems and infrastructure required for a data processing pipeline in support of a product's requirements.
• Primary responsibilities revolve around DevOps and include implementing ETL (extract, transform, and load) pipelines and monitoring/maintaining data pipeline performance.
• Plan, design, and monitor the key characteristics of the analytics platform to help ensure that it complies with enterprise standards and performs adequately as additional analytics solutions are implemented.
• Select appropriate technologies from the many open-source, commercial on-premises, and cloud-based offerings available. Integrating a new generation of tools within the existing environment is also crucial to help ensure access to accurate and current data.
• Accommodate the many, sometimes conflicting, requirements and constraints of a big data analytics architecture. Requirements come from diverse stakeholders, such as line-of-business users, data scientists, analysts, and administrators.
• Collaborate with various technical and architecture teams across countries to define a common, scalable, global solution architecture.
• Stay up to date on industry trends and maintain external relationships to develop state-of-the-art knowledge of emerging technologies and to stay abreast of the latest thinking in technology arenas.
• Perform discovery, analysis, and reporting on a snapshot of data in a one-off, stand-alone fashion where valuable.
• Help maintain the integrity and security of the company database.
• Analyze database implementation methods to make sure they are in line with company policies and any external regulations that may apply.
• Demonstrate initiative, self-motivation, creative problem-solving, and effective interpersonal/communication skills that support building collaboration between teams.
• Design architecture diagrams that help the team and integration partners understand the solution and vision.
• Influence peers and technical staff within the team; collaborate with internal customers and work teams across departments.

Basic Qualifications:

• Bachelor's degree or equivalent combination of education and work experience in software development
• Development experience with Azure Data Factory objects: ADF pipelines, integration runtimes, configurations
• 5+ years of BI development experience using the MS technology stack
• 3+ years of experience developing both multidimensional and tabular models with large and complex datasets

Minimum Qualifications:

• Experience working with Azure Data Lake is desirable
• Hands-on knowledge of ADF activities, data ingestion, and integration with other services
• Experience in SQL, stored procedures, and indexes
• Strong proficiency in DAX, SQL, Excel, and Power BI
• Successful track record of delivering high-quality products on time while working in Agile teams and following Agile methodologies
• Able to coordinate technical standards within a development team
• Strong verbal, written, and presentation skills
• Strong object-oriented concepts
• Deep understanding of and practical experience implementing best practices for technical design and development
• Agility and quick learning; understanding of the software development lifecycle
• Azure Synapse, Azure Data Catalog, machine learning, and statistical modeling experience is a plus

  • ID: #49303854
  • State: Arizona
  • City: Glendale, 85301, USA
  • Salary: Depends on Experience
  • Job type: Contract
  • Showed: 2023-02-19
  • Deadline: 2023-04-18
  • Category: Et cetera