Data Lead Architect

A & T SYSTEMS INC
Baltimore, MD

About The Position

The Data Lead Architect will provide expertise in current principles and practices of architecture, data management systems, and large system designs, including Enterprise Data Warehouse (EDW) and cloud-based Data Lake solutions. The role requires experience in data modeling, including Medallion Architecture and Dimensional Data Modeling (e.g., Star and Snowflake schemas), knowledge of healthcare data standards (EDI X12, FHIR, HL7), and expertise in Amazon Web Services (AWS). This position focuses on transitioning legacy systems to modern platforms by managing information flows, data exchange, and standardization services. It involves leading a team in designing, building, and maintaining data pipelines, implementing OLAP solutions with Apache Iceberg or Delta Lake, and ensuring HIPAA and CMS compliance.

Requirements

  • Bachelor’s Degree from an accredited college or university with a major in Engineering, Computer Science, Mathematics or a related field.
  • At least ten (10) years of experience planning, designing, building, and implementing IT systems.
  • At least seven (7) years of experience in data engineering, ETL development, or cloud data architecture.
  • At least two (2) years in a technical leadership role.
  • At least five (5) years of experience in the direct supervision and management of major projects involving professional support services and/or the integration, implementation, and transition of large complex system and subsystem architectures.
  • Experience as a lead or chief architect for large-scale IT implementation efforts, particularly in healthcare data environments.
  • Broad understanding of client IT environmental issues and solutions.
  • Recognized as an expert within the IT industry.
  • Advanced team-building and mentoring abilities.
  • Demonstrated excellence in written and verbal communication skills.
  • Experience architecting and implementing cloud-based EDWs and Data Lakes using Medallion Architecture.
  • Proficiency with AWS services such as Glue, Redshift, Athena, EMR, S3, HealthLake, and Lake Formation.
  • Experience developing and optimizing scalable ETL/ELT workflows using Apache Spark, Airflow, or similar tools.
  • Hands-on experience with EDI X12 transactions (e.g., 837, 835, 278) and healthcare data integration.
  • Experience implementing data quality, lineage, security, and governance controls in compliance with HIPAA and CMS regulations.
  • Strong programming skills in SQL, Python, or Scala.

Nice To Haves

  • Master’s degree.

Responsibilities

  • Lead and mentor a team of data engineers in developing and maintaining scalable, automated data pipelines.
  • Architect and implement a Medallion Architecture-based Data Lakehouse with Bronze (Raw), Silver (Cleansed), and Gold (Aggregated) layers.
  • Design and develop an EDW on cloud platforms (e.g., AWS) using technologies such as Apache Iceberg or Delta Lake.
  • Build and optimize Dimensional Data Models using Star Schema and Snowflake Schema.
  • Develop and optimize ETL/ELT workflows using tools like AWS Glue, Apache Spark, Databricks, and Airflow.
  • Work with AWS services (e.g., Glue, Redshift, Athena, EMR, HealthLake, Lake Formation) or equivalent platforms.
  • Process and integrate healthcare data in EDI X12 formats (837, 835, 278, etc.) and map to FHIR and HL7 standards.
  • Ensure compliance with HIPAA and CMS interoperability standards through data governance, lineage, access controls, and encryption.
  • Collaborate with business and technical stakeholders to align data architecture with organizational goals.
  • Drive the adoption of CI/CD, automation, and infrastructure-as-code practices in data engineering workflows.