About The Position

Tempo is seeking a Senior Analytics Engineer to build and maintain the data infrastructure that supports GTM and corporate reporting. This role owns the curated datasets, metric definitions, and transformation logic that the business relies on for consistent, trustworthy data. The engineer will collaborate with stakeholders to translate business needs into reporting-ready data models and, increasingly, to make those models accessible to AI-powered tools for data querying and interpretation. The scope includes revenue and subscription analytics, reconciliation with other systems, and the semantic layer. The focus is on designing and maintaining the data layer itself: the tables, definitions, and documentation that keep downstream reporting reliable, consistent, and self-serve, rather than on dashboard production or ad hoc reporting.

Requirements

  • 4-7 years of experience in analytics engineering, data analytics, or a related role in a SaaS or technology environment
  • Strong SQL skills and hands-on experience with dbt (building models, writing tests, managing documentation)
  • Experience working with subscription or revenue data, including concepts like ARR, churn, retention, and waterfall reporting
  • Strong judgment on how to structure and organize data for downstream reporting and analysis. You care about naming, grain, consistency, and making things easy to use
  • A natural investigator: when numbers don't match, you dig until you find the root cause rather than patching over the discrepancy
  • Familiarity with BigQuery and BI platforms such as Looker or Omni
  • Familiarity with ETL and reverse ETL tools for moving data between enterprise systems (e.g. Fivetran, Hightouch)
  • Comfortable working cross-functionally and partnering with teammates to clarify requirements and definitions
  • Strong written communication skills with the ability to explain data logic, tradeoffs, and metric definitions clearly to both technical and non-technical audiences
  • Interest in how AI tools are changing the way organizations interact with data, and enthusiasm for building data layers that are legible to both humans and machines
  • Bachelor's degree in a quantitative field such as Statistics, Economics, Computer Science, or similar, or equivalent practical experience

Responsibilities

  • Design, build, and maintain curated datasets in dbt Cloud that power GTM and corporate reporting across the full revenue lifecycle
  • Own metric definitions and ensure consistent logic across recurring reporting, reducing duplicated, ambiguous, or conflicting calculations
  • Investigate data discrepancies end-to-end: trace issues across transformation layers, validate outputs against source systems (e.g. Salesforce), and resolve root causes
  • Document core datasets and metric definitions in dbt YAML so that both human users and AI-powered query tools inherit accurate, consistent context
  • Translate complex business logic into clean, well-tested data models
  • Support a limited set of downstream reporting and one-off analytical requests, especially where they help refine or pressure-test the underlying data model
  • Over time, grow into more direct partnership with business stakeholders to improve reporting clarity, usability, and trust

Benefits

  • Remote-first work environment
  • Unlimited vacation in most of our locations
  • Great benefits including health, dental, vision, and savings plans
  • Perks such as training reimbursement, WFH reimbursement, and more
  • Diverse and dynamic teams with challenging and exciting work
  • An opportunity to have a real impact on our business
  • A great range of social activities (both in person and virtual)
  • Optional in-person meet-ups and the ability to travel to our international offices
  • Employee referral program