About The Position

We’re looking for an experienced Senior Data Engineer to help us change how the world works. Here, you’ll be part of our Data Engineering & Analytics team, supporting cross-functional groups around the world. The right candidate will develop, review, and maintain data infrastructure and a variety of data flows, and build out continuous data validation and telemetry for all data processes.

This is an exciting time at Alludo, with new leadership, a refreshed brand, and a whole new approach to changing the way the world works. We want you to feel safe to be who you are, take risks, and show us what you’ve got. At Alludo, we’re serious about empowering people to work when, how, and where they want.

Requirements

  • Expert knowledge of Python
  • Expert knowledge of SQL
  • Experience working in a DevOps model
  • 7+ years of professional experience
  • 5+ years of experience working in data engineering, business intelligence, or a similar role
  • 5+ years of experience with ETL orchestration and workflow management tools such as Airflow or Flink, on AWS/GCP
  • 3+ years of experience with distributed data processing tools such as Spark or Presto, and streaming technologies such as Kafka/Flink
  • 3+ years of experience with Snowflake (preferred) or another big data database platform
  • 3+ years of experience with cloud service providers: AWS (preferred) or another major cloud
  • Expertise with container orchestration engines (e.g., Kubernetes)
  • MS in Computer Science, Software Engineering, or a related field preferred; BS in one of the same fields acceptable

Responsibilities

  • Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines for the Data Lake and Data Warehouse
  • Build and implement ETL frameworks to improve code quality and reliability
  • Build and enforce common design patterns to increase code maintainability
  • Ensure accuracy and consistency of data processing, results, and reporting
  • Design cloud-native data pipelines, automation routines, and database schemas that support predictive and prescriptive machine learning
  • Communicate ideas clearly, both verbally and through concise documentation, to business sponsors, business analysts, and technical teams
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform

Benefits

  • Fully remote workspace
  • Flexible hours