Virtasant · Posted 6 days ago
Full-time • Mid Level
Remote
51-100 employees

We’re looking for a Senior Data Engineer to join us and work with our client's Data Platform team. Our client is a leading healthcare technology company, dedicated to transforming the patient and provider experience through innovative, data-driven solutions. You will architect and build core services, automation tools, and integrations that power our client's data ecosystem. You’ll own high-impact platform components, improve pipeline reliability and observability, and partner closely with data engineering, analytics, and DevOps to advance the scalability and developer experience of our client's data platform.

Responsibilities:

  • Build Automation & Tooling: Develop scalable backend services, APIs, and internal tools to automate data platform workflows (e.g., data onboarding, validation, pipeline orchestration, schema tracking, quality monitoring).
  • Data Platform Integration: Integrate tools with core data infrastructure, building pipelines (Airflow, Spark, dbt, Kafka, Snowflake, or similar) to expose capabilities via APIs and UIs.
  • Observability & Governance: Build visualization and monitoring components for data lineage, job health, and quality metrics.
  • Collaboration: Work cross-functionally with data engineering, product, and DevOps teams to define requirements and deliver end-to-end solutions.

Requirements:

  • 7+ years of experience in data engineering or software development, with at least 5 years building production-grade data or platform services
  • Strong programming skills in Python and SQL, with hands-on experience on at least one major data platform (Snowflake, BigQuery, Redshift, or similar)
  • Experience developing tooling for schema evolution, data contracts, and developer self-service
  • Deep experience with streaming, distributed compute, or S3-based table formats (Spark, Kafka, Iceberg/Delta/Hudi).
  • Experience with schema governance, metadata systems, and data quality frameworks.
  • Understanding of orchestration tools (Airflow, Dagster, Prefect, etc.)
  • Solid grasp of CI/CD and Docker
  • At least 2 years of experience in AWS
  • Experience building data pipelines with dbt
  • Experience with data observability, data catalog, or metadata management tools
  • Experience working with healthcare data (X12, FHIR)
  • Proven experience in data migration projects (moving from legacy technologies to modern platforms)
  • Experience building internal developer platforms or data portals
  • Understanding of authentication/authorization (OAuth2, JWT, SSO)

Contract details:

  • Fully remote within the contiguous United States, full-time (40 hours/week)
  • Stable, long-term independent contractor agreement
  • Work hours: US Eastern Time office hours