Data Engineer

Pivotal Health
Los Angeles, CA
Hybrid

About The Position

Pivotal Health is seeking a Data Engineer to join our team, working at the intersection of analytics and engineering. The role is about making product data accessible, reliable, and ready for analysis: connecting data sources to the data warehouse, building clean transformation pipelines, and ensuring analysts have the data they need to drive business decisions. It requires a strong technical foundation applied toward business outcomes such as faster reporting, improved data access, and trustworthy pipelines. The ideal candidate enjoys building infrastructure for analysis and is motivated by the business impact of their work.

Requirements

  • Strong SQL skills with hands-on experience in modern cloud data warehouses: BigQuery, Snowflake, or Redshift
  • Proficient with dbt for managing SQL transformations: you understand how to write clean, maintainable, well-documented models
  • Comfortable with Python at a working level, enough to build and automate data workflows without needing to be a full software engineer
  • Experience with at least one BI or reporting tool (Tableau, Power BI, Metabase, or similar)
  • You think in business outcomes: your resume reflects the impact your work had, not just the tools you used
  • Self-directed and comfortable with ambiguity: you can identify what needs to be done and execute without heavy guidance
  • Collaborative by nature: you know how to work across teams with different levels of technical depth
  • Startup or high-growth company experience: you're used to environments where ownership is real and speed matters

Nice To Haves

  • Hands-on experience with BigQuery specifically
  • Experience connecting BI tools to a cloud warehouse (e.g., Power BI to BigQuery)
  • Experience with Salesforce data or CRM integrations
  • Background in FinTech, HealthTech, or other data-rich industries

Responsibilities

  • Own the pipeline from product database to analytics warehouse: Take full ownership of extracting data from our PostgreSQL product database and loading it into BigQuery. Design and maintain the ETL processes that make this happen reliably, with the right structure for downstream analytics use.
  • Bring in new data sources: Expand our analytics footprint by integrating new data sources, including third-party tools like Salesforce, into our warehouse. You'll partner with our DevOps team to establish the right service accounts, permissions, and connection patterns to do this securely and correctly.
  • Build and maintain analytics-ready tables: Use dbt to design, build, and manage the transformation layer that turns raw data into clean, well-structured tables. You'll have real ownership over what the data looks like: what gets modeled, how it's shaped, and what makes it most useful for reporting.
  • Support reporting and business insights: Work alongside our analysts to support the reporting layer, ensuring data is fresh, accurate, and structured in a way that makes building dashboards and reports in Tableau, Power BI, or Metabase reliable and efficient.
  • Be the bridge between analytics and engineering: Attend engineering team meetings to stay ahead of product changes that could affect analytics. Serve as the connective tissue between both teams, translating data needs into technical solutions and keeping everyone aligned.

Benefits

  • Competitive compensation, including equity
  • Full health, dental, and vision coverage
  • Retirement savings plan through 401(k)
  • Flexible time off
  • Opportunities for company-wide connection and events


What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

No Education Listed

Number of Employees

1-10 employees
