Data Engineer III

Wayvia
$130,000 - $160,000
Remote

About The Position

The Data Engineer III will lead the development and maintenance of our data infrastructure and tools, ensuring data integrity and availability while driving the strategic direction of data initiatives.

Requirements

  • BS degree in Computer Science or a related field, or equivalent practical experience
  • 5+ years of data engineering experience, with a track record of designing, building, and maintaining ELT pipelines and dimensional data models in production environments
  • Expertise in processing and integrating large-scale data from multiple sources
  • Advanced SQL skills, including complex joins and common table expressions (CTEs)
  • Strong hands-on Snowflake experience (prior admin experience a plus, but not required)
  • Proficiency with dbt — production-grade model development, testing, documentation, and project governance
  • Proficient coding skills in Python, Java, or a similar language
  • Proficiency in Git-based version control and CI/CD pipelines for data infrastructure
  • Familiarity with workflow orchestration platforms (Airflow, Prefect, Dagster, etc.)
  • Strong sense of data ownership — you treat data as a product and hold a high bar for quality and reliability

Nice To Haves

  • Experience with data observability and lineage tools
  • Knowledge of AI/ML
  • Experience building or contributing to a semantic layer or canonical data model, with emphasis on consistent metric and entity definitions across consumers
  • Experience with Looker development or administration
  • Background in e-commerce, retail analytics, or SaaS data platforms

Responsibilities

  • Lead the design, development, testing, deployment, maintenance, and improvement of advanced data engineering solutions and pipelines
  • Prioritize, manage, and deliver multiple high-impact projects
  • Establish data quality standards — automated tests, anomaly detection, and freshness SLAs
  • Write and maintain dbt models, tests, and documentation to ensure data quality and consistency
  • Identify infrastructure improvements to reduce cost, improve reliability, and eliminate single points of failure
  • Build and govern canonical dbt models in Snowflake using dimensional modeling principles, ensuring consistent metric definitions and entity relationships across Looker and the MCP server for both BI and AI consumers
  • Own dbt model governance — standardize modeling patterns, enforce testing and documentation requirements, and manage the project structure across domains
  • Define and maintain a consistent semantic layer — metric definitions, business logic, and entity relationships — ensuring alignment across BI tools, the MCP server, and AI consumers regardless of the tooling layer
  • Support Snowflake and Looker administration alongside other data platform team members

Benefits

  • Flexible work-from-home arrangements
  • 401K Match
  • Flexible vacation
  • Medical/Dental/Vision
  • 16 weeks of paid parental leave (US)
  • Technical stipend
  • Professional development programs
  • Wellness programs