About The Position

Key Responsibilities:

  • Design, develop, and manage complex Airflow DAGs for data workflows, ETL/ELT pipelines, and task automation.
  • Integrate Airflow with various data sources (e.g., databases, APIs, cloud storage) and targets.
  • Optimize and monitor pipeline performance and troubleshoot failures promptly.
  • Collaborate with data engineers, analysts, and platform teams to support end-to-end data solutions.
  • Implement best practices for code quality, testing, version control, and deployment.
  • Maintain documentation and provide technical mentorship to junior engineers.

Requirements

  • 3+ years of hands-on experience with Apache Airflow in production environments.
  • Strong programming skills in Python.
  • Solid understanding of data engineering concepts, including ETL/ELT processes, workflow orchestration, and data validation.
  • Experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their storage and compute services.

Nice To Haves

  • Knowledge of containerization with Docker and orchestration with Kubernetes.

Benefits

  • Medical, vision, and dental benefits
  • 401(k) retirement plan
  • Variable pay/incentives
  • Paid time off
  • Paid holidays
© 2024 Teal Labs, Inc