Vacancy expired!
- 5+ years of application development and implementation experience
- 5+ years of experience delivering complex enterprise-wide information technology solutions
- 5+ years of ETL (Extract, Transform, Load) Programming experience
- 3+ years of reporting experience, analytics experience or a combination of both
- 4+ years of Hadoop development/programming experience
- 5+ years of operational risk, credit risk, or compliance domain experience
- 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop
- 6+ years of Java or Python experience
- 5+ years of Agile experience
- 5+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop
- 5+ years of Apache Spark design and development experience using Scala, Java, or Python, with DataFrames and Resilient Distributed Datasets (RDDs)
- 2+ years of experience integrating with RESTful APIs
- Experience designing and developing data analytics solutions using object data stores such as S3
- Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Spark and Apache Sqoop
- Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
- Knowledge and understanding of DevOps principles
- ID: #43713846
- State: North Carolina, USA (ZIP 28201)
- City: Charlotte
- Salary: Depends on Experience
- Job type: Contract
- Showed: 2022-06-30
- Deadline: 2022-08-28
- Category: Et cetera