Vacancy expired!
- Experience designing, building, and deploying production-level data pipelines using tools such as HDFS, Hive, Spark, HBase, Kafka, and NiFi.
- Must have: strong experience working with event-driven architecture
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams, and Kafka Control Center
- Must have: strong knowledge of the Kafka Connect framework, with experience using several connector types (HTTP REST proxy, JMS, File, SFTP, JDBC, etc.)
- Provide expertise and hands-on experience building custom connectors using Kafka core concepts and APIs
- Must have: experience implementing Kafka consumers to read data from Kafka partitions; KSQL, APIs, Kafka security (SSL, SASL, Kerberos, ACLs), Elasticsearch, and advanced configuration
- Must have experience handling huge volumes of streaming messages from Kafka
- Must have experience with Kubernetes/OpenShift (AWS Elastic Kubernetes Service, Azure Kubernetes Service, or Red Hat OpenShift preferred)
- Comfortable building services with a CI/CD pipeline (CircleCI, Jenkins)
- Experience with NiFi and an understanding of flow-based programming desired
- Experience in Java preferred
- Good to have: experience with Snowflake DB
- Bachelor’s or master’s degree in Computer Science, Mathematics, or Statistics.
- 5-7 years of real-time Kafka experience (preferred).
- 3 years of KSQL and Kafka Connect experience (preferred).
- 8-10 years of Java experience.
- 2-4 years of NiFi experience.
- 2-4 years of Kubernetes/OpenShift experience.
- ID: #41111096
- State: Texas
- City: San Antonio 78201, USA
- Salary: Depends on Experience
- Job type: Contract
- Showed: 2022-05-18
- Deadline: 2022-07-14
- Category: Et cetera