DATE3 - Data Engineer - Level III

ATTAINX INC, Richland, WA
Remote

About The Position

We are in search of a highly collaborative and experienced Senior Data Analyst/Data Engineer to support the Office of the Chief Data Officer (OCDO) and the Office of Performance Quality (OPQ) for a federal government contract. In this role, you will design and maintain robust data pipelines, perform in-depth analysis of large-scale datasets, and deliver actionable insights that drive mission decisions. You will work within a Databricks environment, leveraging SQL, PySpark, and Python to transform raw agency data into reliable, governed, and analytics-ready assets. The ideal candidate combines strong engineering fundamentals with analytical acumen and is comfortable operating within complex federal data environments.

Requirements

  • Active DHS or public trust clearance
  • U.S. citizen
  • Bachelor’s degree in computer science, information systems, data science, engineering, mathematics or a related technical field
  • 10+ years of experience in data engineering, analytics engineering or large-scale data platform development
  • Demonstrated experience supporting federal government programs, preferably within DHS or similar high-security environments
  • Expert-level proficiency in SQL, including advanced query tuning, distributed query optimization and large dataset performance engineering
  • Extensive hands-on experience with PySpark, distributed data processing and performance optimization techniques
  • Strong proficiency in Python for automation, data manipulation and pipeline orchestration
  • Deep experience with Databricks, including clusters, jobs, notebooks, Unity Catalog and Delta Lake
  • Strong understanding of lakehouse architecture, medallion design patterns and enterprise data governance
  • Experience with Git-based development workflows, code review processes and CI/CD integration for data engineering

Nice To Haves

  • Databricks Certified Data Engineer Professional or equivalent senior-level certification
  • Experience architecting solutions in AWS GovCloud, Microsoft Azure Government or Google Cloud for Government
  • Hands-on experience implementing CI/CD for data pipelines, including automated testing, quality gates and deployment pipelines (Azure DevOps, GitHub Actions)
  • Strong experience with enterprise BI platforms (Tableau, Power BI) and integrating them with Databricks SQL endpoints
  • Experience influencing data strategy, platform modernization or cloud migration initiatives

Responsibilities

  • Architect, lead and optimize large-scale ETL/ELT pipelines using PySpark, Python, Spark SQL and Databricks to support advanced analytics, fraud detection and inter-agency data sharing.
  • Implement enterprise-grade monitoring, observability and data quality frameworks, including automated validation, anomaly detection and SLA-driven alerting.
  • Develop, tune and review complex SQL for large datasets, distributed compute environments and mission-critical analytical workloads.
  • Build automated ingestion frameworks for APIs, flat files, RDBMS sources and streaming feeds, ensuring lineage, auditability and compliance with Department of Homeland Security (DHS) and federal data governance standards.
  • Design and govern data lakehouse architectures, including Delta Lake, medallion (bronze/silver/gold) patterns, schema evolution and performance optimization.
  • Lead exploratory data analysis to identify operational trends, anomalies and mission-impacting insights across large, federated datasets.
  • Build and maintain analytics products, dashboards and reporting solutions using Databricks SQL, Tableau, Power BI and SSRS.
  • Support machine learning and entity resolution initiatives through feature engineering, data preparation and scalable data services.
  • Translate ambiguous business problems into structured analytical frameworks, delivering executive-ready visualizations, technical briefs and data-driven recommendations.
  • Serve as a technical subject matter expert (SME) and engineering lead across multi-disciplinary teams including data scientists, program analysts, cybersecurity and IT operations.
  • Provide mentorship, code reviews and architectural guidance to engineering teams across multiple programs.
  • Lead requirements analysis, system design, integration and modernization efforts for enterprise systems supporting operations and high-security missions.
  • Produce comprehensive documentation for pipelines, data models, governance processes and analytical notebooks to support audits, continuity and knowledge transfer.
  • Drive Agile delivery, backlog prioritization and iterative delivery of high-impact data capabilities aligned with agency mission objectives.

Benefits

  • Competitive compensation and benefits packages including paid vacation, paid holidays and sick pay
  • Medical, dental and vision plans plus HSA and FSA accounts
  • Matching 401(k) plan
  • Tuition, training, certification and professional development programs
  • Long- and short-term disability insurance