Sr. Data Engineer - Hybrid Position - Saint Louis, MO

25 Mar 2024


Hi,

Job Title: Sr. Data Engineer (12+ years is a must)
Duration: Long-Term Contract
Location: St. Louis, MO (in person 2 days preferred; remote okay if candidate is exceptional)

Must Have: Strong in Scala and Spark (a brief illustrative sketch follows below)
  • 12+ years of experience is a must
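For orientation only, here is a minimal Spark-in-Scala sketch of the kind of transaction-scale aggregation described in the role summary below. The dataset path and column names (merchant_id, txn_ts, amount) are invented for the example and are not from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransactionSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-summary")
      .getOrCreate()

    // Hypothetical input: Parquet files with merchant_id, txn_ts, and amount columns.
    val txns = spark.read.parquet("/data/transactions")

    // Daily spend per merchant -- a typical aggregation over billions of rows.
    val dailySpend = txns
      .withColumn("txn_date", to_date(col("txn_ts")))
      .groupBy(col("merchant_id"), col("txn_date"))
      .agg(sum("amount").as("total_spend"), count("*").as("txn_count"))

    dailySpend.write
      .mode("overwrite")
      .partitionBy("txn_date")
      .parquet("/data/daily_spend")

    spark.stop()
  }
}
```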
As a Senior Data Engineer in the Data Engineering & Analytics team, you will develop data and analytics solutions that sit atop vast datasets gathered by retail stores, restaurants, banks, and other consumer-focused companies. The challenge will be to create high-performance algorithms, cutting-edge analytical techniques including machine learning and artificial intelligence, and intuitive workflows that allow our users to derive insights from big data that in turn drive their businesses. You will have the opportunity to create high-performance analytic solutions based on data sets measured in the billions of transactions, as well as front-end visualizations that unleash the value of big data.

You will also have the opportunity to develop data-driven, innovative analytical solutions, identify opportunities to support business and client needs in a quantitative manner, and facilitate informed recommendations and decisions through activities like building ML models, automating data pipelines, designing data architectures/schemas, and running jobs on a big data cluster using different execution engines and programming languages such as Hive/Impala, Python, Spark, R, etc.

Your Role
  • Drive the evolution of Data & Services products/platforms with an impact focused on data science and engineering
  • Design machine learning systems and self-running artificial intelligence (AI) software to automate predictive models
  • Ensure that algorithms generate accurate user recommendations
  • Turn unstructured data into useful information, for example through auto-tagging of images and text-to-speech conversion
  • Solve complex problems with multi-layered data sets, and optimize existing machine learning libraries and frameworks
  • Support deployed data applications and analytical models as a trusted advisor to Data Scientists and other data consumers, identifying data problems and guiding issue resolution with partner Data Engineers and source data providers
  • Ensure proper data governance policies are followed by implementing or validating data lineage, quality checks, classification, etc. (a minimal quality-check sketch follows after this list)
  • Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test
  • Experiment with new tools to streamline the development, testing, deployment, and running of our data pipelines.
  • Maintain awareness of relevant technical and product trends through self-learning/study, training classes and job shadowing.
  • Participate in the development of data and analytic infrastructure for product development
  • Continuously innovate and determine new approaches, tools, techniques & technologies to solve business problems and generate business insights & recommendations
  • Partner with roles across the organization including consultants, engineering, and sales to determine the highest priority problems to solve
  • Evaluate trade-offs between many possible analytics solutions to a problem, taking into account usability, technical feasibility, timelines, and differing stakeholder opinions to make a decision
  • Break large solutions into smaller, releasable milestones to collect data and feedback from product managers, clients, and other stakeholders
  • Evangelize releases to users, incorporate feedback, and track usage to inform future development
  • Work with small, cross-functional teams to define the vision, establish team culture and processes
  • Consistently focus on key drivers of organization value and prioritize operational activities accordingly
  • Escalate technical errors or bugs detected in project work
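As a hedged illustration of the governance bullet above ("implementing or validating ... quality checks"), the sketch below shows one way a lightweight quality gate might look in Spark/Scala. The column names, the specific checks, and the fail-the-run policy are assumptions made for the example, not requirements stated in the posting.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object QualityGate {
  final case class CheckResult(name: String, passed: Boolean, detail: String)

  // Run a few simple validations over a hypothetical transactions DataFrame.
  def run(df: DataFrame): Seq[CheckResult] = {
    val total           = df.count()
    val nullKeys        = df.filter(col("merchant_id").isNull).count()
    val negativeAmounts = df.filter(col("amount") < 0).count()
    val duplicateIds    = total - df.dropDuplicates("txn_id").count()

    Seq(
      CheckResult("non_null_merchant_id", nullKeys == 0, s"$nullKeys null keys"),
      CheckResult("non_negative_amount", negativeAmounts == 0, s"$negativeAmounts negative amounts"),
      CheckResult("unique_txn_id", duplicateIds == 0, s"$duplicateIds duplicate txn_ids")
    )
  }

  def main(args: Array[String]): Unit = {
    val spark   = SparkSession.builder().appName("quality-gate").getOrCreate()
    val results = run(spark.read.parquet("/data/transactions"))

    results.foreach(r => println(s"${r.name}: ${if (r.passed) "PASS" else "FAIL"} (${r.detail})"))

    // Stop the pipeline run if any check fails, so bad data never reaches consumers.
    if (results.exists(r => !r.passed)) sys.exit(1)
    spark.stop()
  }
}
```

In practice such checks are often expressed with a dedicated library (e.g., Deequ or Great Expectations), but the idea is the same: measure, compare against a threshold, and stop bad data from propagating.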

Ideal Candidate Qualifications
  • Superior academic record at a leading national university in Computer Science, Data Science, Computer Engineering, Technology, or a related field, or equivalent work experience
  • Expertise in data engineering, including implementation of multiple end-to-end data warehouse (DW) projects in a big data environment
  • At least 5 years of experience as a data engineer or machine learning engineer, including experience with open-source tools
  • Prior experience working in a product development/management role
  • Experience building and deploying production-level, data-driven applications and data processing workflows/pipelines
  • Experience with application development frameworks (Java/Scala, Spring)
  • Experience with data processing and storage frameworks like Hadoop, Spark, Kafka
  • Experience implementing REST services with support for JSON, XML and other formats
  • Experience with performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts (see the tuning sketch after this list)
  • Experience working in Agile teams
  • Good analytical skills for writing and performance-tuning complex SQL queries, debugging production issues, providing root-cause analysis, and implementing mitigation plans
  • Ability to quickly learn and implement new technologies and to perform POCs to explore the best solution for a given problem statement
  • Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams
  • Strong project management skills
  • Experience implementing machine learning systems at scale in Java, Scala, or Python, and delivering analytics across all phases: data ingestion, feature engineering, modeling, tuning, evaluation, monitoring, and presentation
  • Curiosity, creativity, and excitement for technology and innovation
  • Demonstrated quantitative and problem-solving abilities
  • Ability to multi-task and strong attention to detail
  • Motivation, flexibility, self-direction, and desire to thrive on small project teams
  • Good communication skills, both verbal and written, along with strong relationship, collaboration, and organizational skills
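As a concrete (and assumed, not posting-specified) example of the SQL/ETL performance tuning mentioned above: in Spark, broadcasting a small dimension table avoids shuffling a large fact table during a join. The table paths and the merchant_id join key are invented for the sketch.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object BroadcastJoinExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("broadcast-join-tuning").getOrCreate()

    val facts     = spark.read.parquet("/data/transactions") // large fact table
    val merchants = spark.read.parquet("/data/merchants")    // small dimension table

    // Without the hint, Spark may plan a sort-merge join that shuffles the fact
    // table; broadcasting the small side keeps the large table in place.
    val enriched = facts.join(broadcast(merchants), Seq("merchant_id"))

    enriched.explain() // inspect the physical plan chosen by the optimizer
    enriched.write.mode("overwrite").parquet("/data/transactions_enriched")

    spark.stop()
  }
}
```

Checking the physical plan with explain() confirms whether Spark chose a BroadcastHashJoin instead of a shuffle-heavy SortMergeJoin.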
The following skills will be considered a plus:
  • Financial institution or payments experience
  • Batch processing and workflow tools such as NiFi
  • Experience developing integrated cloud applications with services like Azure, Databricks, AWS, or Google Cloud Platform
  • Experience in managing/working in Agile teams
  • Experience developing and configuring dashboards

  • ID: #49544529
  • Location: Saint Louis, MO 63102, USA
  • Salary: $70 - $90
  • Job type: Contract
  • Showed: 2023-03-25
  • Deadline: 2023-05-23
  • Category: Et cetera