Senior Data Engineer

Tava Health · Salt Lake City, UT

About The Position

At Tava Health, we believe mental health care should be as accessible and stigma-free as a checkup. We're reimagining the entire experience, from how people find a therapist to how providers deliver care, so more individuals can get the support they need, when they need it. We're a fast-growing team on a bold mission: to make high-quality mental health care available to everyone. If you're passionate about using technology to solve meaningful problems and create lasting change, we'd love to meet you.

About the Role

Tava Health is expanding its data engineering team and is hiring a Senior Data Engineer. This is a hands-on role, responsible for building and owning the company's data infrastructure from the ground up. You'll be the go-to person for all things data engineering: managing integrations, building pipelines, and leading centralized data governance. Your work will support teams across the organization. This is a high-autonomy, high-impact role in a dynamic, fast-paced environment, so strong time management and prioritization skills are essential.

Requirements

  • 5+ years of data engineering experience in high-growth or complex environments.
  • Proficiency with modern data warehousing tools such as BigQuery or Snowflake.
  • Strong SQL and data modeling skills.
  • Experience writing production-level code in a general-purpose language.
  • Integration experience with tools like Salesforce, HubSpot, or Iterable.
  • Outstanding time and backlog management skills; able to manage competing priorities in a fast-paced environment.
  • Self-starter who works well independently and cross-functionally with both technical and non-technical stakeholders.

Nice To Haves

  • Experience with pub/sub architectures for real-time or event-driven data systems.
  • Familiarity with Fivetran, Metabase, and Airbyte.
  • Proficiency in TypeScript/Node.js.
  • Background in health tech or experience working with sensitive data.

Responsibilities

  • Build and maintain robust data pipelines using tools like Airflow, Airbyte, and BigQuery.
  • Manage and evolve our data warehouse architecture.
  • Integrate third-party platforms including Salesforce, HubSpot, Iterable, ZocDoc, and Metabase.
  • Lead the implementation of centralized data governance, defining sources of truth and managing data flow standards.
  • Partner with cross-functional teams to implement clean, reporting-ready data models in dbt.
  • Write clean, production-grade code in a general-purpose language (e.g., Python).
  • Monitor data quality and ensure reliability across systems.