Big Data / BigQuery

11 Feb 2025

Vacancy expired!

W2 CONTRACT

NO C2C

You will be responsible for developing scalable big data pipeline solutions in the Google Cloud Platform Data Factory. In addition, you will:

  • Apply advanced knowledge of the Google Cloud Platform ecosystem, with a focus on BigQuery
  • Design and code BigQuery solutions to analyze data collections
  • Analyze user needs to determine how software should be built or whether existing software should be modified
  • Participate in design, delivery estimates, and code reviews
  • Develop and/or perform automated software testing procedures, solutions, and frameworks to ensure software functions as needed
  • Translate business requirements and specifications into usable and scalable software
  • Understand the capabilities and limitations of the data the software outputs
  • Understand and assist with the technical infrastructure of an application or system
  • Determine and execute the software deployment process and troubleshoot performance issues
  • Develop data quality and validation routines
  • Build distributed, reliable, and scalable data pipelines to ingest and process data in real time, including unstructured data

5+ years of experience with the following:
  - Data design, data architecture, and data modeling (both transactional and analytic)
  - Building big data pipelines for operational and analytical solutions
  - Running and tuning queries in databases including BigQuery, SQL Server, and Hive
  - Data management, including running queries and compiling data for analytics
  - Developing code in one or more languages such as Java, Python, and SQL

2+ years of experience with the following:
  - Google Cloud Platform data implementation projects (Apache Beam, Spring Boot, Dataflow, Airflow, BigQuery, Cloud Storage, Cloud Build, Cloud Run, etc.)
  - In-depth understanding of Apache Beam and the Spring Boot framework
  - Agile methodologies
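The "data quality and validation routines" listed above might look like the following minimal Python sketch. The field names and rules (`id`, `amount`) are hypothetical; in a real pipeline of the kind this role describes, checks like these would typically run inside a Beam/Dataflow transform before loading rows into BigQuery.

```python
# Minimal sketch of a data-quality/validation routine (hypothetical schema).
# Each record is a dict; invalid records are rejected with a list of reasons.

def validate_record(record: dict) -> list:
    """Return a list of validation errors for one ingested record."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    return errors


def split_valid(records: list) -> tuple:
    """Partition records into (valid, rejected) for downstream loading.

    Rejected entries keep their error list so they can be routed to a
    dead-letter table for inspection.
    """
    valid, rejected = [], []
    for record in records:
        errors = validate_record(record)
        if errors:
            rejected.append((record, errors))
        else:
            valid.append(record)
    return valid, rejected
```

For example, `split_valid([{"id": "a1", "amount": 10.0}, {"id": "", "amount": -5}])` keeps the first record and rejects the second with two errors.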
  • Preferred Skills / Experience:
    - Certification: Google Professional Data Engineer
    - 7 years of experience in a software engineering role building complex data pipelines for operational and analytical solutions
    - Experience programming and producing working models or transformations with modern programming languages
    - Experience designing and deploying large-scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Teradata, Tableau, Qlik, or others
  • Education Required: Bachelor’s degree in Computer Science, Computer Engineering, Information Technology, or equivalent experience
  • Education Preferred: Master's degree in Data Science, Computer Science, or Data Engineering; Certification: Google Professional Data Engineer

  • ID: #49149053
  • State: Michigan
  • City: Dearborn 48120, USA
  • Salary: $60 - $70
  • Job type: Permanent
  • Showed: 2023-02-11
  • Deadline: 2023-04-09
  • Category: Et cetera