Vacancy expired!
- Looking for a Senior Big Data Engineer/Architect on Google Cloud Platform (GCP) to help strategize, architect, and implement solutions for migrating data from our on-prem platform to GCP. The architect will design and implement the enterprise infrastructure and platforms required to set up data engineering pipelines using the tools available on GCP. As a GCP Platform Architect, you will work on advanced data engineering products using Google big data technologies such as GCS, Dataproc, Airflow, Datastore, and BigQuery.
- Very strong leadership and communication skills, maintaining the right negotiating posture with customer and program teams to drive the right decisions.
- Experience leading one or more areas of a cloud transformation journey (strategy, design, application migration planning, and implementation) for private and public clouds.
- Cloud foundation design, build, and implementation
- Cloud Transformation & Migration
- Cloud managed services (IaaS and PaaS)
- Google Cloud Certified Professional Cloud Architect Certification
- Bachelor's degree with 3-5 years' experience on Google Cloud, including deep design and development experience with GCP products across Infrastructure, Data Management, Application Development, Smart Analytics, Artificial Intelligence, Security, and DevOps
- Extract, Transform, and Load (ETL) & big data tools: BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, Google Data Studio, Google Cloud Storage
- NoSQL databases: Cloud Bigtable, Cloud Firestore, Firebase Realtime Database, Cloud Memorystore. Search technologies: Lucene and Elasticsearch
- Relational Databases: Cloud Spanner, Cloud SQL
- Strong knowledge of Google Cloud Storage object lifecycle management
- Strong knowledge of BigQuery slots management
- Cost optimization for Dataproc workload management
- Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Splunk, etc.)
- Development and deployment technologies (e.g., JIRA, GitHub, Jenkins, Nexus, Artifactory)
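As context for the GCS lifecycle-management skill listed above: these policies are small JSON documents applied to a bucket (e.g., via `gsutil lifecycle set`). A minimal sketch follows; the storage class and age thresholds are illustrative assumptions, not requirements from this posting.

```python
import json

# Sketch of a GCS object lifecycle policy, in the JSON shape accepted by
# `gsutil lifecycle set policy.json gs://<bucket>`.
# The thresholds and storage class below are illustrative assumptions.
lifecycle_policy = {
    "rule": [
        {
            # Move objects to cheaper Nearline storage after 30 days.
            "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
            "condition": {"age": 30},
        },
        {
            # Delete objects outright after one year.
            "action": {"type": "Delete"},
            "condition": {"age": 365},
        },
    ]
}

# Serialize to the policy file that would be uploaded to the bucket.
print(json.dumps(lifecycle_policy, indent=2))
```

Rules are evaluated independently per object, so ordering in the `rule` list does not imply sequencing; each condition is checked against the object's age on its own.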
- ID: #43768795
- State: New Jersey, USA
- City: Piscataway 08854
- Salary: Depends on Experience
- Job type: Contract
- Posted: 2022-07-02
- Deadline: 2022-08-12
- Category: Et cetera