Data Engineer, Top Secret Clearance

Blue Sky Innovators
Reston, VA

About The Position

Seeking a Lead Data Engineer to serve as the Mission Data Pipeline Subject Matter Expert (SME). You will work directly with government, technical, and industry stakeholders to design, implement, and sustain data pipelines that ingest, transform, store, and distribute mission-relevant data across the DIFC2 software ecosystem and external mission partner systems.

You will support the development and delivery of software capabilities that improve data accessibility, mission integration, operational visibility, and cross-system interoperability in a highly dynamic mission environment. The focus is on enabling reliable, automated, and secure data flows that support mission capabilities on operational timelines.

This position requires collaboration with mission system stakeholders, platform engineers, and security teams to define data exchange artifacts, develop pipeline implementations, and ensure interfaces comply with Risk Management Framework (RMF) requirements. The engineer will leverage both low-code/no-code ETL technologies and custom software development to implement scalable data pipelines that support the data architecture.

Requirements

  • 9 years of experience and a Bachelor’s degree in Computer Science, Software Engineering, Management Information Systems/Information Systems, or a related discipline; or a Master's degree and 7 years of experience; or a PhD/JD and 4 years of experience.
  • 7+ years of experience supporting software engineering, systems integration, platform implementation, or application development efforts within DoD, IC, SAP/SCI, or other complex mission environments.
  • Experience designing or implementing data pipelines, ETL workflows, or data integration architectures.
  • Experience working with structured and semi-structured data formats (JSON, CSV, XML, etc.).
  • Active Top Secret clearance, with the ability to obtain and maintain Sensitive Compartmented Information (SCI) and Special Access Program (SAP) access, and a willingness to consent to a polygraph examination.

Nice To Haves

  • Proficiency with low-code/no-code data orchestration technologies such as Apache NiFi, Apache Airflow, or similar ETL orchestration tools.
  • Experience implementing data transformation logic using Python and data processing libraries such as Pandas and NumPy.
  • Certification or demonstrated experience as a Palantir Foundry Data Engineer, including use of Code Repositories, Pipeline Builder, AIP Logic, and other platform-relevant technologies.
  • Experience designing pipelines that process and fuse large-scale datasets from multiple sources.
  • Familiarity with DoW data architectures, mission system integration, or data interoperability frameworks.
  • Ability to communicate effectively with engineers, architects, operators, and senior Government leadership while translating technical concepts into mission-relevant value.
  • Proven ability to work in highly regulated, fast-moving environments where technical excellence, mission responsiveness, and stakeholder coordination are all critical to success.

Responsibilities

  • Analyze mission workflows and system architecture to identify the data artifacts required to support operational capabilities.
  • Define the structure, semantics, and lifecycle of mission data products within the data architecture.
  • Ensure data artifacts align with architectural standards for interoperability, traceability, and reuse across mission components.
  • Serve as the primary technical liaison for defining data exchange interfaces between mission systems.
  • Collaborate with mission partners to define message schemas, data formats, transport mechanisms, and interface expectations.
  • Document data interface specifications to support both system integration and operational sustainment.
  • Support the RMF process by defining technical details for machine-to-machine data interfaces.
  • Produce or contribute to required RMF artifacts, including data message descriptions; Ports, Protocols, and Services Management (PPSM) entries; and system topology diagrams and interface documentation.
  • Coordinate with cybersecurity engineers to ensure pipeline implementations meet DoW security and compliance requirements.
  • Design and implement automated data pipelines that ingest, transform, persist, and expose data artifacts for use by internal and external stakeholders.
  • Utilize government-provided data platform infrastructure to build scalable ETL workflows.
  • Develop custom transformation logic when required using appropriate programming languages and data processing frameworks.
  • Expose processed data products through standardized service interfaces such as REST APIs or platform-native services.
  • Implement data validation, normalization, and transformation logic to ensure accuracy and usability of integrated datasets.
  • Troubleshoot data pipeline failures and implement monitoring, logging, and recovery mechanisms.
  • Optimize pipeline performance to support mission timelines and large-scale data processing requirements.