Level 1
Job Description:
Databricks Operations is part of the Hadoop Support team, which is responsible for managing the Databricks platform across the company. The team works within an Agile delivery (Scrum/Kanban/Scrumban) / DevOps methodology and supports application development teams in debugging and fixing issues. Responsibilities and activities of this team include, but are not limited to:
- Build, upgrade, and maintain Hadoop clusters with several nodes.
- Monitor and troubleshoot services running on Hadoop clusters.
- Performance tuning.
- Set up backup and recovery.
- Troubleshoot user issues, including user onboarding and job failures.
- Install and integrate new services onto the Hadoop cluster.
- Work with the vendor to discuss and apply bug fixes and patches.
- Deploy and automate implementations and fixes using Ansible scripts.
Requirements:
- Hands-on experience on the administration side of Databricks, preferably on Azure (a minimal illustrative sketch follows the Qualification list below).
- Prior experience in a support role on the admin side of Databricks on Azure, AWS, or Google Cloud Platform.
- Excellent knowledge of Git and Jenkins; understanding of distributed systems, databases, and cloud computing environments.
- Azure/Linux (mandatory skills).
- General familiarity with Docker and Kubernetes concepts.
- Hands-on experience with the Azure stack (Azure Data Lake, Azure Data Factory, Azure Databricks).
- Good understanding of other Azure services such as Azure Data Lake Analytics & U-SQL and Azure SQL DW.
- Demonstrated analytical and problem-solving skills, particularly those that apply to a big data environment.
- Experience deploying Azure Databricks workspaces using IaC (Terraform + Azure DevOps).
Qualification:
- Experience with the Cloudera Hadoop distribution (CDH 6.x and CDP 7) is preferred.
- Experience with cluster maintenance tasks such as adding and removing nodes, enabling high availability, installing services, and applying patches.
- Unix/Linux knowledge, including the ability to understand hardware, operating system, and network settings.
- Experience with Hadoop ecosystem components, including HDFS, YARN, Hive, Impala, Spark, Sqoop, Kafka, Flume, and Solr.
- Unix shell, Perl, or Python scripting.
- Kerberos and LDAP integration experience.
- Knowledge of TLS/SSL certificates to enable encryption across Hadoop services.
- Some development experience in Databricks on Azure, AWS, or Google Cloud Platform.
- Some developer skills in Python.
- Some experience with Terraform for IaC.
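As an illustration of the admin-side Databricks work referenced above, here is a minimal sketch that lists a workspace's clusters and their states via the Databricks REST API 2.0 Clusters endpoint. The DATABRICKS_HOST and DATABRICKS_TOKEN environment variable names and the script itself are illustrative assumptions, not part of this posting's stack.

```python
# Minimal admin-side sketch: report each Databricks cluster and its state.
# Assumes a workspace URL and a personal access token are exported as
# DATABRICKS_HOST and DATABRICKS_TOKEN (illustrative names, not mandated here).
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

# GET /api/2.0/clusters/list returns {"clusters": [...]} (or {} if none exist).
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(f'{cluster["cluster_id"]}  {cluster["state"]:>12}  {cluster["cluster_name"]}')
```

The same pattern extends to other administrative calls (for example, /api/2.0/clusters/pin or /api/2.0/clusters/restart) by swapping the endpoint; in day-to-day work the official databricks-cli or the Terraform provider would typically wrap these calls.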
- ID: #49262575
- State: Georgia, USA
- City: Alpharetta
- Salary: Depends on Experience
- Job type: Contract
- Posted: 2023-02-17
- Deadline: 2023-04-17
- Category: Et cetera