Staff Data Engineer

Interwell Health
Remote

About The Position

Interwell Health is a kidney care management company that partners with physicians on its mission to reimagine healthcare—with the expertise, scale, compassion, and vision to set the standard for the industry and help patients live their best lives. We are on a mission to help people, and we know the work we do changes their lives. If there is a better way, we will create it. So, if our mission speaks to you, join us!

Reporting to the Director of Data Engineering, the Staff Data Engineer serves as a senior technical leader responsible for shaping, scaling, and governing our modern data ecosystem. This role blends architecture, hands-on engineering, platform leadership, and cross-functional partnerships to deliver high-quality data products that power clinical, operational, financial, and analytical outcomes. Deep experience with Databricks, Python, dbt, and Microsoft Fabric, along with strong fluency in healthcare data and compliance standards, is essential. At its core, you’ll work closely with teams across the organization to deliver governed, high-quality, analytics-ready data at scale.

Our Tech Stack: Databricks, Delta Lake, Unity Catalog, Microsoft Fabric (OneLake, Lakehouse, Data Factory), Azure, dbt, Python, PySpark, Spark SQL.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 7+ years of experience in data engineering.
  • 2+ years operating in a senior- or staff-level engineering role.
  • Deep hands-on proficiency with Databricks, Spark, Delta Lake, dbt, and Python.
  • Proven ability to design and operate large-scale cloud data platforms (Azure preferred).
  • Hands-on experience with Microsoft Fabric: Data Engineering, Data Factory, Lakehouse, OneLake.
  • Advanced data platform architecture and Lakehouse design expertise.
  • Demonstrated ability to design modular, extensible frameworks and guide the long-term evolution of enterprise data platforms.
  • Strong command of distributed data processing and cloud native engineering.
  • Experience working in HIPAA regulated environments and handling PHI.
  • Healthcare data fluency, including regulated data handling and compliance.
  • Technical leadership, mentorship, and influence across teams.
  • Strong communication skills with both technical and clinical stakeholders.
  • Experience with platform reliability, CI/CD for data pipelines, and infrastructure as code.
  • 100% remote (ET or CT work hours preferred).

Nice To Haves

  • Experienced in implementing and supporting Epic integrations, leveraging Cogito Cloud and Caboodle data models, and delivering reliable incremental data pipelines from Caboodle/Clarity.

Responsibilities

  • Architecture & Strategy
  • Design and evolve a scalable, secure, cloud‑native lakehouse platform leveraging Databricks, Microsoft Fabric (OneLake, Lakehouse, Data Factory), and dbt.
  • Define modeling patterns, governance frameworks, and engineering best practices across the data lifecycle.
  • Lead design reviews and guide teams in adopting scalable architectural patterns.
  • Drive long‑term platform strategy and evaluate emerging technologies.
  • Hands-on Engineering
  • Design and implement batch and streaming data pipelines for healthcare data sources (EHR, claims, HL7/FHIR, APIs, flat files, databases).
  • Develop modular ingestion, quality, lineage, metadata, and observability frameworks that scale across domains.
  • Produce clean, analytics‑ready datasets and data models for BI, analytics, and machine learning workloads.
  • Implement HIPAA‑aligned access patterns and secure handling of PHI.
  • Architect Databricks workloads (clusters, jobs, Unity Catalog, Delta Lake) for reliability, performance, and cost efficiency.
  • Integrate Databricks and Microsoft Fabric with Azure services and enterprise systems.
  • Leadership & Collaboration
  • Partner with product managers, data scientists, analysts, clinicians, and business stakeholders to translate healthcare data needs into scalable solutions.
  • Lead cross-functional initiatives that modernize and unify the organization’s data ecosystem.
  • Mentor senior and mid-level engineers; elevate team capability through technical coaching and standards.
  • Drive roadmap planning, platform evolution, and long-term data strategy.
  • Champion engineering excellence, reliability practices, documentation quality, and governance.