Senior Data Engineer

Apricot Health
Oklahoma City, OK

About The Position

At Apricot, we believe that nurses are the unsung heroes of healthcare, and they deserve to be treated as such. Our goal is to empower nurses by freeing them from the shackles of paperwork, allowing them to focus on what they became nurses to do: take care of patients.

Role Overview: We are looking for a Senior Data Engineer to own Apricot’s analytics foundation end-to-end. This role is responsible for designing, building, and evolving our data pipelines, medallion architecture (Bronze → Silver → Gold), and analytics models as our application, customers, and data volume scale. You’ll turn raw PostgreSQL application data into trusted, analytics-ready datasets in BigQuery that power:

  • Omni Topics and semantic models
  • Dashboards used across Product, CX, Ops, and Leadership
  • Embedded analytics surfaced to customers

This is not a back-office reporting role. You will be a builder, architect, and thought partner, working closely with Engineering and Product to ensure our data stays robust as schemas evolve and the product grows.
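The medallion flow described above can be pictured with a short, hedged sketch. This is an illustration only: the dataset, table, and column names (bronze_app.patients, silver.patients, updated_at) are assumptions rather than Apricot’s actual schema, and the SQL assumes BigQuery Standard SQL with a timestamp column available for partitioning.

    -- Hypothetical Bronze → Silver step (illustrative names, not Apricot's schema)
    -- Bronze: raw rows replicated from PostgreSQL into BigQuery
    -- Silver: cleaned, deduplicated, analytics-ready rows
    CREATE OR REPLACE TABLE silver.patients
    PARTITION BY DATE(updated_at)           -- assumes updated_at is a TIMESTAMP
    AS
    SELECT patient_id, email, status, updated_at
    FROM (
      SELECT
        id AS patient_id,
        LOWER(TRIM(email)) AS email,        -- normalize for downstream joins
        status,
        updated_at,
        ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) AS rn
      FROM bronze_app.patients              -- raw replica of the PostgreSQL table
    )
    WHERE rn = 1;                           -- keep only the latest version of each row

A Gold layer would then aggregate Silver tables into the metric-level datasets that Omni Topics, dashboards, and embedded analytics read from.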

Requirements

  • Strong experience building analytics data pipelines (ELT) in a modern warehouse
  • Deep SQL expertise (BigQuery preferred): complex joins, window functions, performance tuning
  • Proven experience designing analytics-friendly data models
  • Hands-on ownership of production data workflows—not just analysis
  • Comfort working in ambiguity and making pragmatic architectural decisions
  • Experience defining metrics and maintaining a shared source of truth
  • Ability to design semantic layers and dashboards that real teams actually use
  • Strong instincts around data usability, not just correctness
  • Ability to explain tradeoffs and data meaning to non-technical stakeholders
  • Experience with BigQuery (or similar columnar warehouse)
  • Familiarity with dbt or equivalent transformation/testing patterns (see the small test sketch after this list)
  • Experience managing schema evolution and downstream impacts
  • Practical understanding of access controls and sensitive data handling
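
To make the transformation and testing bullet above concrete, here is a tiny sketch of a dbt-style singular test: a SELECT that returns any rows violating an expectation, so the test fails whenever rows come back. The model and column names are hypothetical, continuing the illustrative schema from the sketch above.

    -- Hypothetical dbt singular test, e.g. tests/assert_patient_id_unique.sql
    -- dbt marks the test as failed if this query returns any rows.
    SELECT
      patient_id,
      COUNT(*) AS num_rows
    FROM {{ ref('patients') }}    -- dbt resolves this to the (illustrative) Silver patients model
    GROUP BY patient_id
    HAVING COUNT(*) > 1           -- duplicates would indicate a broken dedup step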

Nice To Haves

  • Omni experience (or similar semantic BI tools)
  • Python for data tooling, automation, or validation
  • Experience with embedded analytics
  • Familiarity with medallion or layered warehouse architectures
  • Healthcare or regulated-data experience

Responsibilities

Own the data platform & pipelines
  • Design and maintain ELT pipelines from PostgreSQL → BigQuery
  • Build and evolve a medallion architecture that is resilient to schema and product changes
  • Materialize clean, well-documented Silver and Gold tables for analytics use
  • Manage scheduling, freshness, and reliability of data transformations

Data modeling & analytics enablement
  • Define analytics-ready models that balance correctness, usability, and performance
  • Create and maintain semantic models / Topics in Omni
  • Ensure metric definitions are consistent, well-documented, and trustworthy
  • Partner with stakeholders to translate questions into durable data models

Dashboards & self-serve analytics
  • Build and maintain dashboards used by teams across the company
  • Support embedded analytics use cases for customers
  • Design dashboards that emphasize clarity, sensible defaults, and drill-downs
  • Enable teams to answer their own questions without constant rework

Data quality, governance, and scale
  • Implement guardrails for data quality, freshness, and definition drift
  • Monitor downstream impact of application schema changes
  • Apply PHI-safe practices and org-level access controls
  • Optimize BigQuery performance and cost (partitioning, clustering, materialization strategy); see the sketch after this list

Cross-functional collaboration
  • Work closely with Product and Engineering as the application evolves
  • Help shape analytics strategy, not just execute tickets
  • Serve as a thought partner on how data should be structured to support future use cases
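
As a hedged illustration of the partitioning, clustering, and materialization bullet above, the statement below shows the general shape of a cost-aware Gold-layer table in BigQuery. All names (gold.shift_facts, silver.shifts, shift_date, facility_id, hours) are invented for the example.

    -- Hypothetical Gold-layer materialization (illustrative names only)
    -- Partitioning by date limits how much data each dashboard query scans;
    -- clustering on a common filter column reduces scanned bytes further.
    CREATE OR REPLACE TABLE gold.shift_facts
    PARTITION BY shift_date                 -- assumes shift_date is a DATE column
    CLUSTER BY facility_id
    AS
    SELECT
      shift_date,
      facility_id,
      COUNT(*)   AS shifts,
      SUM(hours) AS total_hours
    FROM silver.shifts
    GROUP BY shift_date, facility_id;

Dashboards that filter on date ranges and facility can then prune partitions automatically, which keeps query cost predictable as data volume grows.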