Senior Data Engineer

9amHealth
Remote

About The Position

9amHealth is seeking a Senior Data Engineer to join its Data & Analytics team. This role involves owning and evolving the data platform that supports clinical operations, business intelligence, and AI initiatives. It is a hands-on, full-stack data role requiring strong software engineering skills with a specialization in data. The position involves building and maintaining pipelines, designing analytical data models, contributing to backend application code, and adopting AI-assisted development practices. The ideal candidate is proficient in Python and SQL, comfortable with AWS data services, and thrives in a fast-paced startup environment.

Requirements

  • 10+ years of professional experience in data engineering, analytics engineering, or a hybrid data/backend software engineering role.
  • Strong software engineering background: this role requires someone who can write, test, debug, and ship production code, not just query data.
  • Expert-level Python: deep experience building production data pipelines, ETL logic, and reporting systems in Python.
  • Expert-level SQL: window functions, CTEs, recursive queries, query optimization, and performance tuning at scale.
  • Hands-on experience with AWS data services, specifically Glue, S3, Redshift, Athena, CloudFormation, CloudWatch, and IAM.
  • Experience with MySQL/Aurora in a production environment.
  • Hands-on experience building and operating data pipelines with AWS Glue, Spark, dbt, Airflow, or comparable frameworks.
  • Deep experience with at least one modern BI platform (Looker/LookML strongly preferred; Tableau also valued), including semantic modeling, dashboard design, and self-service enablement.
  • Solid understanding of data modeling techniques: star/snowflake schemas, slowly changing dimensions, event-based models.
  • Familiarity with AI-assisted coding tools (GitHub Copilot, Claude Code, Cursor, Cody) and a demonstrated interest in integrating AI into engineering workflows.
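To illustrate the kind of Python-plus-SQL fluency these requirements describe (a hypothetical sketch, not part of the posting): deduplicating records to keep the latest row per key is the in-Python analogue of a `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ... DESC) = 1` filter, a pattern that comes up constantly in ETL logic.

```python
from typing import Iterable


def latest_per_key(rows: Iterable[dict], key: str, order_by: str) -> list[dict]:
    """Keep the most recent row per key.

    The Python equivalent of SQL's
    ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_by DESC) = 1.
    """
    latest: dict = {}
    for row in rows:
        k = row[key]
        # Replace the stored row only if this one sorts later.
        if k not in latest or row[order_by] > latest[k][order_by]:
            latest[k] = row
    return list(latest.values())
```

A candidate comfortable with both sides of the stack should be able to write this in either language and explain the trade-offs (in-database window functions push work to Redshift/Athena; the Python version suits streaming or pre-load cleanup).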

Nice To Haves

  • Proficiency in Java (Spring Boot, Maven/Gradle) with experience shipping backend services or data-intensive applications to production.
  • AWS certifications (e.g., Solutions Architect, Data Analytics Specialty, or Database Specialty).
  • Experience in health tech, digital health, or regulated industries (HIPAA familiarity is a plus).
  • Experience with CI/CD for data assets (dbt CI, Great Expectations, or similar).
  • Background in building or contributing to AI/ML features: feature stores, training pipelines, model serving, or RAG architectures.
  • Comfort with infrastructure-as-code (Terraform, CloudFormation) and containerized deployments (Docker, ECS/EKS).
  • Prior experience in a startup or high-growth environment where you owned outcomes end to end.
  • Track record of improving developer experience and productivity through tooling, automation, or process improvements.

Responsibilities

  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows in Python using AWS Glue, Apache Spark, Airflow, or equivalent orchestration tools.
  • Write production-grade Python code for DWH logic, reporting jobs, data transformations, and internal tooling, following software engineering best practices (testing, code review, CI/CD).
  • Develop and optimize analytical data models (dimensional, OBT, or hybrid) that serve self-service BI and advanced analytics use cases.
  • Build and maintain dashboards, explores, and semantic layers in Looker and/or Tableau; serve as the analytics infrastructure owner ensuring data quality and governance.
  • Contribute backend application code in Java (Spring Boot) or Python to support data-intensive features, API integrations, and internal services.
  • Champion modern AI coding practices across the data team, leveraging tools like GitHub Copilot, Claude, Cursor, or similar AI-assisted development environments to accelerate delivery and code quality.
  • Author and maintain comprehensive SQL assets (stored procedures, views, complex queries) across Redshift, Aurora/MySQL, and Athena.
  • Operate and optimize AWS data infrastructure including Glue, S3, Redshift, CloudFormation, CloudWatch, IAM, and Athena.
  • Collaborate closely with clinical operations, product, finance, and engineering teams to translate business questions into reliable, well-documented data products.
  • Implement data quality frameworks, monitoring, alerting, and incident response processes for the data platform.
  • Contribute to the architecture and data strategy for AI/ML features, including data prep, feature engineering, and model monitoring.
  • Mentor the existing data analyst and data engineer; help establish team standards, code review practices, and documentation norms.
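As a concrete example of the data-quality monitoring responsibility above (a minimal sketch, assuming a simple rows-as-dicts representation; not from the posting): a null-rate check of the kind frameworks like Great Expectations formalize, reduced to a single function.

```python
def check_null_rate(rows: list[dict], column: str, max_null_rate: float = 0.01) -> dict:
    """Minimal data-quality check: fail if a column's null rate exceeds a threshold.

    Returns a small result record suitable for logging or alerting.
    """
    if not rows:
        # An empty batch has no nulls to measure; treat as passing.
        return {"column": column, "null_rate": 0.0, "passed": True}
    nulls = sum(1 for r in rows if r.get(column) is None)
    rate = nulls / len(rows)
    return {"column": column, "null_rate": rate, "passed": rate <= max_null_rate}
```

In practice checks like this run after each pipeline load, with failures routed to alerting (e.g., CloudWatch) rather than silently passing bad data downstream.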

Benefits

  • Health, dental, and vision insurance
  • Flexible PTO
  • Work-from-home options
  • Professional development budget
  • Continuing education support