There are still lots of open positions. Let's find the one that's right for you.
The position requires a strong background in technologies including Hadoop, Spark, cloud platforms, Java, Kafka, and streaming pipelines, with a proven track record of coding in at least one language such as Scala or Python. Hands-on experience with a cloud computing platform like GCP or Azure is essential, along with skills in data modeling and data migration. Familiarity with data warehousing and BI is preferred, as is experience with workflow scheduling and orchestration tools such as Automic and Airflow. The role emphasizes high standards of code quality, system reliability, and performance.