Google Cloud Platform Hadoop Developer

09 Nov 2024

Vacancy expired!

Please find the job description below and let me know if you are interested.

Google Cloud Platform - Data Hadoop Developer

Location: Remote until Covid; after Covid, candidates need to be on site at one of these locations: Dallas, TX / Tampa, FL / Jersey City, NJ
Duration: Long Term

Primary Skills:

  • Google Cloud Dataflow
  • Hadoop Ecosystem
  • Big Data

Requirements:

  • Google Cloud Platform BigQuery
  • Google Cloud Platform Dataproc
  • Python, SQL

Job Description:

  • Build and maintain data management workflows.
  • Build and maintain data ingestion pipelines for batch, micro-batch, and real-time streaming on BigQuery with Google Cloud.
  • Google Cloud Platform Certified developer on BigQuery and Dataproc.
  • Experience building data ingestion pipelines for batch, micro-batch, and real-time streaming on big data/Hadoop platforms.
  • Hands-on experience with Hadoop big data tools: HDFS, Hive, Presto, Apache NiFi, Sqoop, Spark, Logstash, Elasticsearch, Kafka, and Pulsar.
  • Experience collecting data from a Kafka/Pulsar message bus and transporting the data to public/private cloud platforms using NiFi, Data Highway, and Logstash technologies.
  • Experience building CI/CD pipelines and DevOps is preferred.
  • Development experience with Agile Scrum/SAFe methodology.

  • ID: #22433842
  • State: Texas, USA
  • City: Plano, 75094
  • Salary: USD 55/hr
  • Job type: Contract
  • Showed: 2021-11-09
  • Deadline: 2022-01-07
  • Category: Web/HTML/info design