- Good experience with Databricks.
- Experience with Java OR Golang.
- 5+ years of experience with Snowflake data warehousing concepts.
- 5+ years of experience designing, building, deploying, testing, maintaining, monitoring and owning scalable, resilient and distributed data pipelines.
- High proficiency in at least two of Scala, Python, Spark, or Flink, applied to large-scale data sets.
- Strong understanding of workflow management platforms (Airflow or similar).
- Familiarity with advanced SQL.
- Expertise with big data technologies (Spark, Flink, Data Lake, Presto, Hive, Apache Beam, NoSQL).
- Knowledge of batch and streaming data processing techniques.
- Obsession with service observability, instrumentation, monitoring, and alerting.