Airflow Data Architect

Booz Allen Hamilton, McLean, VA
Posted 5 days ago | $62,000 - $141,000

About The Position

The Opportunity:

Your combination of systems thinking and technical expertise makes you the go-to architect for complex data platforms. As a Data Architect with deep experience in Apache Airflow, Kubernetes, and containerized environments, you know how to design scalable, resilient data orchestration platforms. At Booz Allen, you’ll use those skills to modernize and standardize data workflow orchestration supporting mission-critical federal systems.

On our team, you’ll architect and govern the use of Apache Airflow as an enterprise data orchestration platform, supporting multiple agile data engineering teams. You’ll define architecture patterns, deployment models, and operational standards for Airflow running in Kubernetes-based, containerized environments. You’ll guide teams on DAG design, dependency management, scheduling strategies, and CI/CD practices, while ensuring security, reliability, and scalability across environments. You’ll work closely with platform engineers, data engineers, and cloud teams to ensure Airflow integrates cleanly with broader data and cloud architectures.

In this role, you’ll directly impact federal mission operations by enabling reliable, automated data pipelines that support time-sensitive operational decision-making. You’ll help teams move faster and more safely by providing clear architectural direction, reusable patterns, and guardrails. With opportunities to mentor engineers and shape the platform roadmap, you’ll play a key role in delivering resilient data solutions at scale.

Due to the nature of the work performed within this facility, U.S. citizenship is required.

Work with us as we improve data orchestration platforms to change federal mission systems for the better. Join us. The world can’t wait.

Requirements

  • 5+ years of experience in data engineering or application architecture, including enterprise-scale data platforms
  • Knowledge of Apache Airflow architecture, including DAG design patterns, scheduling, dependency management, and operational best practices
  • Ability to architect and support Apache Airflow deployments in containerized, Kubernetes-based environments
  • Ability to define and enforce architecture standards for workflow orchestration, CI/CD, and environment separation, including development, testing, and production
  • Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
  • Bachelor’s degree

Nice To Haves

  • Experience developing Airflow DAGs in Python to prototype solutions and review team implementations
  • Experience supporting Airflow integrations with distributed compute frameworks, such as Spark, Databricks, or Kafka
  • Knowledge of Kubernetes-native tooling, including Helm, container security, and resource optimization
  • Knowledge of cloud platforms, such as AWS, Azure, or GCP, and managed Kubernetes services
  • Ability to design reusable Airflow components, including custom operators, sensors, hooks, and plugins
  • Master’s degree in Computer Science or Engineering
  • CKA, CKAD, or a cloud architecture certification

Benefits

  • Health, life, disability, financial, and retirement benefits, as well as paid leave, professional development, tuition assistance, work-life programs, and dependent care