- Extensive working experience with Hadoop Distributed File System (HDFS), Spark, Spark Streaming, Oozie, Hive, Kafka, Kafka Message Queues, Splunk, Flume, Pig, Scala, and Sqoop.
- The candidate will be responsible for designing, building, and maintaining security-enabled Big Data workflows/pipelines that process billions of records into and out of our Hadoop Distributed File System (HDFS).
- The candidate will engage in sprint requirements and design discussions.
- The candidate will be proactive in troubleshooting and resolving data processing issues.
- The candidate should be a highly accountable self-starter with a strong sense of urgency who can work autonomously with limited direction.
- Excellent knowledge and experience with Java/J2EE programming including Hadoop MapReduce.
- Experience with object-oriented design (OOD), object-oriented programming (OOP) and development, data structures, and design patterns.
- Strong knowledge, implementation skills, and experience with security processes involving Java-Hadoop communication, Spark Streaming, and Kafka.
- Experience working in a highly productive Agile/Scrum software development environment using tools like Version One, with the ability to apply industry-standard best practices throughout the software development lifecycle.
- Experience with REST API development using Spring Boot framework and JWT Security.
- Extensive experience developing in Linux environments using shell scripting and Python.
- Proficiency working in a cloud-based development environment using OpenShift containers and Jenkins CI/CD pipelines.
- Experience with traditional relational databases as well as NoSQL database design and troubleshooting (HBase, MongoDB, and PostgreSQL).
- Experience with version control software such as Git/Bitbucket.
- Experience with complex programming, program debugging, data analysis, problem analysis, and issue resolution within HDFS.
- Communication skills to present ideas and concepts effectively; strong and proven problem-solving skills.
- Strong technical and analytical documentation skills.
- A self-starter and highly motivated team player who adapts to a dynamic work environment and can mentor others.
- Able to multi-task and work in a dynamic, fast-paced environment.
- Ability to investigate and research issues, determine impact, and provide solutions.
- Experience running workloads in a cloud computing environment such as AWS.
- Exposure to writing Technical White Papers and systems design documents.
- Prior experience with Federal or State government IT projects.
- Bachelor’s degree with 7+ years of hands-on experience or master’s degree with 5+ years of experience in a related field.