Data Integration Engineer

Cherokee Federal
Remote

About The Position

As required by our governmental client, this position requires US citizenship. ATA LLC is seeking a Data Integration Engineer to support healthcare data integration efforts within an Azure‑based data platform. This role is hands‑on and delivery‑focused, with a strong emphasis on Python‑driven data pipelines, Azure Synapse, and healthcare interoperability (FHIR/HL7). A core expectation of this role is the ability to design, test, and validate data pipelines in environments where upstream specifications may be incomplete or inconsistent. The ideal candidate brings strong technical judgment, repeatable testing strategies, and the ability to raise data quality standards across the team.

Location: “Work from Anywhere” in the Continental United States, with the ability to travel to the Greater Washington D.C. Metropolitan Area or to a client location from time to time. Our preference is for remote personnel in the Greater Washington D.C. metro area or Huntsville, AL.

Requirements

  • This is a Health IT opportunity; previous experience working in Health IT and with healthcare data and data standards is required.
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent experience.
  • 2–5 years of experience in data integration or data engineering roles.
  • Hands-on experience with HL7, FHIR, X12, or similar healthcare data formats.
  • Proficiency with Git for version control and collaborative development.
  • Experience using Terraform to deploy or manage cloud infrastructure.
  • General knowledge of cloud environments (Azure, AWS, or GCP).
  • Working knowledge of Azure Synapse or similar cloud data platforms.
  • Experience working with Parquet file formats in data engineering workflows.
  • Strong SQL and/or Python skills for data manipulation and validation.

Nice To Haves

  • A positive, willing attitude.
  • Self-motivated, with the ability to make and meet commitments.
  • An ability to think on your feet and solve problems quickly.
  • An ability to learn new subject areas on the fly.
  • Enjoys working in a cross-disciplinary team environment.
  • Technology-agnostic, with the ability to apply the right tool to each requirement.

Responsibilities

  • Design, build, and maintain data pipelines in Azure Synapse using Python.
  • Implement and operate a medallion data architecture (Bronze, Silver, Gold layers).
  • Ingest, transform, and publish data in CSV, Parquet, and XML formats.
  • Perform complex data mapping and transformation across healthcare data sources.
  • Work directly with HL7 and FHIR healthcare data standards.
  • Define and execute data pipeline testing strategies, including:
      • Validation of transformations and mappings
      • Data completeness, accuracy, and consistency checks
      • Repeatable, team‑adoptable testing approaches
  • Operate effectively in situations where test cases or specs are not clearly provided, helping establish defensible validation criteria.
  • Serve as a technical lead, setting patterns and best practices the broader team can follow.
  • Collaborate closely with engineering, QA, and stakeholders to improve data quality and delivery outcomes.
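To illustrate the kind of repeatable, team‑adoptable validation work the responsibilities above describe, here is a minimal sketch of a data‑quality check in plain Python. The field names (`patient_id`, `dob`) and the checks chosen are hypothetical examples, not part of the posting; a real pipeline would run checks like these against each medallion layer.

```python
def validate_records(records, required_fields):
    """Return a dict of data-quality findings for a batch of row dicts."""
    findings = {"missing_fields": [], "duplicate_ids": []}
    seen_ids = set()
    for i, row in enumerate(records):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if not row.get(field):
                findings["missing_fields"].append((i, field))
        # Consistency: the primary identifier must be unique in the batch.
        pid = row.get("patient_id")
        if pid in seen_ids:
            findings["duplicate_ids"].append((i, pid))
        seen_ids.add(pid)
    return findings

# Hypothetical batch: row 1 is missing a date of birth and repeats an ID.
rows = [
    {"patient_id": "A1", "dob": "1980-01-01"},
    {"patient_id": "A1", "dob": ""},
]
report = validate_records(rows, ["patient_id", "dob"])
```

Because the function returns structured findings rather than raising on the first failure, the same checks can be rerun after every transformation and asserted on in automated tests, which is what makes the approach repeatable across a team.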

Benefits

  • Generous paid time off
  • An employee incentive program
  • Continuous learning culture
  • Internal Investment Projects (IIP)
  • Virtual brown-bags/level-ups and other professional development activities
  • Recruiting bonuses
  • 3% 401(k) Safe Harbor contributions
  • Medical/dental/vision coverage
  • Long- and short-term disability
  • AD&D insurance
  • Life insurance