About The Position

Data is at the heart of every decision made at Klaviyo, and we’re looking for a Business Intelligence Data Engineer to join our Go To Market (GTM) team supporting Compensation Analytics. This data domain aims to improve the experience of all Klaviyos on variable compensation plans; secondarily, the role will support ancillary functions of the Professional Services organization. The role sits in Data Engineering as part of the GTM team, within a hub-and-spoke model of analytics engineering at Klaviyo. You’ll build and steward the source of truth for compensation data so People leaders and analysts can answer compensation questions quickly and confidently, and turn those insights into a more incentivized, higher-performing organization. You will directly support the compensation analytics teams at Klaviyo, working cross-functionally with Systems, Payroll, Audits, Planning, and People Operations. You will be an independent, self-sufficient, embedded partner to all of GTM leadership where variable compensation plans exist, capable of translating ambiguous requirements into stable data products. You’ll be supported by the broader Data Engineering organization’s standards, tooling, and review practices.

How You’ll Make a Difference

  • Deliver a compensation-data single source of truth (SSoT) that drives a better employee experience, is SOX compliant, and is fully auditable by both internal and third-party organizations.
  • Stand up and maintain curated, documented marts that make it easy to monitor attainment health, quota setting, audits, and cycles, so that leadership can minimize time in front of compensation boards and focus on their organizations.
  • Own the pipelines and models end-to-end: build and maintain reliable integrations from core compensation systems (e.g., SPM/ICM/CRM), model them in dbt, and publish governed marts and reverse-ETL feeds to the operational destinations where they create value.
  • Create attainment views with Compensation and People Analytics: partner with analysts to build quota → book-of-business management → attainment → submission → booking lifecycle views of quarterly compensation, focusing on quicker cadences to booking and dynamic reconciliation processes.
  • Raise the bar on data reliability and governance: instrument monitoring and alerting, tests (freshness/volume/constraints), and documentation so the compensation data ecosystem is discoverable, auditable, and self-serve.
  • Operate as a trusted partner to leadership: work directly with Operations and Compensation leadership to scope problems, clarify trade-offs, and communicate technical concepts in exec-ready language.
  • Transform workflows by putting AI at the center, building smarter systems and ways of working from the ground up.
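
For a concrete flavor of the reverse-ETL work above, a feed of this kind might look like the minimal sketch below: it reads a curated Snowflake mart and pushes rows to an operational endpoint. All table, column, endpoint, and credential names here are illustrative assumptions, not details from this posting.

```python
# Reverse-ETL sketch (illustrative only): publish a curated Snowflake mart to
# an operational tool. Table, column, and endpoint names are assumptions.
import os

import requests
import snowflake.connector

OPS_ENDPOINT = "https://ops.example.com/api/attainment"  # hypothetical destination

def sync_attainment() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
    )
    try:
        cur = conn.cursor()
        # Hypothetical curated mart: one row per rep per quarter.
        cur.execute(
            "SELECT rep_id, fiscal_quarter, quota, attainment_pct "
            "FROM comp_marts.fct_attainment WHERE is_current_quarter"
        )
        for rep_id, fiscal_quarter, quota, attainment_pct in cur.fetchall():
            resp = requests.post(
                OPS_ENDPOINT,
                json={
                    "rep_id": rep_id,
                    "quarter": fiscal_quarter,
                    "quota": float(quota),
                    "attainment_pct": float(attainment_pct),
                },
                timeout=30,
            )
            resp.raise_for_status()  # surface destination failures loudly
    finally:
        conn.close()

if __name__ == "__main__":
    sync_attainment()
```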

Requirements

  • 3–5+ years in analytics/data engineering with production ELT in Snowflake + dbt + SQL; Python for orchestration/utilities.
  • Demonstrated independence partnering with senior, non‑technical leaders; able to translate open‑ended needs into scalable data products.
  • Proven experience implementing tests, monitoring, and documentation that keep pipelines healthy and reporting trustworthy.
  • Experience building data integrations and reverse‑ETL pipelines that support business operations.
  • Airflow (orchestration) and Fivetran/Workato (ELT/integration); a minimal Airflow DAG sketch follows this list.
  • Familiarity with data privacy controls (masking/row-level security) in people data.
  • AWS experience (S3/EC2/Lambda) and IaC/Terraform.
  • You’ve already experimented with AI in work or personal projects, and you’re excited to dive in and learn fast.
  • You’re hungry to responsibly explore new AI tools and workflows, finding ways to make your work smarter and more efficient.
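
As a rough illustration of the Airflow piece, a daily ELT run might be wired up like the sketch below. The DAG name, task names, and ingestion/dbt steps are assumptions for illustration, not the team's actual pipeline.

```python
# Minimal Airflow DAG sketch (TaskFlow API, Airflow 2.4+): orchestrate an
# ingestion step, then a dbt build. All names and steps are illustrative.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["comp"])
def comp_elt():
    @task
    def ingest_comp_feed() -> str:
        # Placeholder for the ingestion step (e.g., triggering a Fivetran sync).
        return "raw.comp_feed"  # hypothetical landing table

    @task
    def build_marts(source_table: str) -> None:
        # Placeholder for invoking `dbt build` against the compensation marts.
        print(f"dbt build (source: {source_table})")

    build_marts(ingest_comp_feed())

comp_elt()
```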

Responsibilities

  • Integrations & ingestion: Own secure ingestion from HRIS/ATS/comp/performance systems into Snowflake; define SLAs/SLOs; implement monitoring & alerting for each feed.
  • Modeling & marts: Design dimensional/entity models (dbt) for employees, positions, org structure, requisitions/offers, performance/promo history, compensation/equity, and movement; publish curated marts with strong contracts and lineage.
  • Reverse ETL: Operationalize high‑value models to downstream tools and workflows using reverse‑ETL patterns to close the loop between insight and action.
  • Quality & governance: Implement tests (unit/integration, schema/freshness), multi-layered validation frameworks that routinely validate data integrity, data policies (masking, purpose‑based access), and documentation that enable safe self‑service across the analytics community.
  • Repository stewardship: Maintain the analytics codebase (dbt repo), perform code reviews, and ensure modular, reusable patterns the broader team can adopt.
  • Stakeholder partnership: Run an intake & engagement model with Compensation Analytics (primary), HRIS/People Tech (security/integration), Finance (plan/comp interfaces), and BI/Platform teams (shared standards).
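
For a flavor of the quality checks above, a freshness/volume gate might look like this sketch. The table name, `loaded_at` column, and thresholds are assumptions; in practice such checks would more likely live in dbt source-freshness tests or a monitoring framework.

```python
# Freshness/volume check sketch. The table name, loaded_at column, and
# thresholds are illustrative assumptions, not details from the posting.
import os

import snowflake.connector

FRESHNESS_SLA_HOURS = 24   # assumed SLA: feed lands at least daily
MIN_ROW_COUNT = 1_000      # assumed volume floor for a healthy load

def check_feed(table: str = "raw.compensation_feed") -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor()
        # Hours since the newest row landed, plus total row count.
        cur.execute(
            f"SELECT DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()), "
            f"COUNT(*) FROM {table}"
        )
        hours_stale, row_count = cur.fetchone()
        if hours_stale > FRESHNESS_SLA_HOURS:
            raise RuntimeError(f"{table} is {hours_stale}h stale (SLA {FRESHNESS_SLA_HOURS}h)")
        if row_count < MIN_ROW_COUNT:
            raise RuntimeError(f"{table} has only {row_count} rows (floor {MIN_ROW_COUNT})")
    finally:
        conn.close()

if __name__ == "__main__":
    check_feed()
```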


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None listed
  • Number of Employees: 501-1,000 employees
