Senior Data Engineer II

Narvar
Remote

About The Position

Narvar is seeking a Senior Data Engineer to own and evolve the data pipelines, platforms, and data products that power its analytics, ML, and merchant-facing products. The role spans the full stack of data infrastructure, from ingestion and transformation to the analytics surfaces merchants use daily. The engineer will make architectural decisions, ship production systems at scale, and work in an AI-native way, using agentic coding tools to increase leverage and ship faster. The position focuses on the post-purchase experience, processing terabytes of transactional data daily and building data products that merchants interact with directly.

Requirements

  • 5–8 years of experience building and operating production data systems
  • Strong SQL skills
  • Proficient in Python, with flexibility to pick up other languages as needed
  • Comfortable building and maintaining APIs
  • Experience with modern data stacks on cloud platforms (GCP preferred, but AWS or Azure transfers well), including cloud data warehouses like BigQuery, ELT patterns, and orchestration with Airflow
  • Deep understanding of data modeling — dimensional modeling, slowly changing dimensions, incremental processing
  • Treat data quality, lineage, and observability as first-class engineering concerns
  • Ability to communicate clearly with technical and non-technical stakeholders, and comfort working cross-functionally
  • Existing use of AI and agentic coding tools as a core part of your workflow, for planning, code generation, debugging, and code review

Nice To Haves

  • Experience in a startup or high-ownership environment where you wore multiple hats
  • Experience building or maintaining embedded analytics or multi-tenant data products
  • Excitement about making data accessible to AI systems, whether through better metadata, semantic layers, or preparing datasets for agentic workflows
  • An eye for cost optimization and data governance

Responsibilities

  • Design, build, and operate data pipelines that process terabytes of transactional data daily using Airflow/Composer and BigQuery
  • Own end-to-end data models and transformations that power merchant analytics, operational reporting, and ML features
  • Build and maintain embedded analytics infrastructure — the data products our merchants interact with directly
  • Evolve our data platform on GCP, including BigQuery, Cloud SQL, AlloyDB, and CDC via Datastream
  • Improve data quality and reliability through testing, observability, alerting, and validation frameworks
  • Own data lineage, metadata, and documentation, and help prepare our data layer for agentic and LLM-powered use cases with semantic clarity and standardized metric definitions
  • Collaborate cross-functionally with product, ML, and GTM teams, and contribute to technical direction through design docs and architecture decisions

Benefits

  • Annual bonus
  • Equity
  • Health insurance
  • Dental insurance
  • Vision insurance
  • Paid holidays
  • Professional development