Data Engineer

UJET
$140,000 - $160,000 | Remote

About The Position

UJET is looking for a Data Engineer to join our growing Data Platform team. This role is ideal for an engineer who enjoys building data platforms and the infrastructure around them. You will help build scalable, reliable data systems and workflows that enable others (analysts, product partners, and engineers) to confidently create and maintain dbt models, and you will be hands-on in developing dbt models yourself, especially for high-impact, cross-functional datasets. Our data stack includes BigQuery, dbt (dbt Labs), Python, and Looker, running on Google Cloud Platform (GCP). Experience with Ruby (or Ruby on Rails) and additional GCP services is a plus.

Requirements

  • 5+ years of experience as a Software Engineer with deep data platform experience
  • Strong programming skills in Python
  • SQL skills and experience with analytical data modeling
  • Hands-on experience with BigQuery (or a similar cloud data warehouse, with willingness to ramp)
  • Production experience with dbt (dbt Labs), including building models yourself
  • Experience building or improving the infrastructure around data workflows (reliability, observability, CI/CD, permissions, environments, deployment patterns)
  • Strong software engineering fundamentals (testing, version control, code reviews)
  • Experience with GCP services such as Cloud Storage, Cloud Functions, Cloud Run, Pub/Sub, Composer

Nice To Haves

  • Experience with Looker (LookML, semantic modeling, dashboards)
  • Experience collaborating with Product to gather requirements and translate business needs into data solutions
  • Experience with streaming or event-based architectures
  • Experience with Ruby or Ruby on Rails

Responsibilities

Data platform enablement
  • Build and evolve the data platform foundations that make it easy and safe for others to ship dbt work (project structure, environments, permissions, patterns, documentation)
  • Establish and maintain standards and guardrails for dbt development (testing strategy, source freshness, documentation, code review practices)
  • Improve the developer experience for data workflows, including CI/CD, automated checks, and repeatable deployment patterns

Trusted metrics and analytics
  • Partner with Analytics and Finance to define and deliver trusted metrics and dashboards in Looker
  • Build and maintain analytics-ready datasets that support self-serve reporting and experimentation

Data modeling and transformation (dbt + BigQuery)
  • Develop and optimize BigQuery data models for analytics and product use cases
  • Implement ELT best practices in dbt, including testing, documentation, and versioning

Pipelines, reliability, and cost
  • Design, build, and maintain scalable data pipelines using Python and dbt
  • Ensure data quality, reliability, and observability for critical datasets and reporting
  • Optimize performance and cost across BigQuery and data pipelines

Cross-functional delivery
  • Integrate data workflows with backend services and APIs
  • Participate in infrastructure decisions related to data ingestion and platform evolution
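To give candidates a feel for the "data quality, reliability, and observability" work above, here is a minimal, hypothetical sketch of the kind of row-level checks a data platform team might run before publishing a critical dataset. The column names and thresholds are illustrative assumptions, not part of UJET's actual stack:

```python
from datetime import datetime, timedelta, timezone

# Illustrative data-quality checks; column names below are hypothetical.

def check_not_null(rows, column):
    """Return the rows where the given column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_freshness(rows, ts_column, max_age_hours=24):
    """True if the newest row is no older than max_age_hours."""
    newest = max(r[ts_column] for r in rows)
    return datetime.now(timezone.utc) - newest <= timedelta(hours=max_age_hours)

rows = [
    {"order_id": 1, "loaded_at": datetime.now(timezone.utc)},
    {"order_id": None, "loaded_at": datetime.now(timezone.utc) - timedelta(hours=2)},
]

bad = check_not_null(rows, "order_id")        # one row fails the null check
fresh = check_freshness(rows, "loaded_at")    # newest row is recent enough
print(len(bad), fresh)                        # → 1 True
```

In a production dbt project, checks like these would typically live as dbt tests (`not_null`, source freshness) rather than ad-hoc Python, but the underlying assertions are the same.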

Benefits

  • Medical, dental, vision
  • 401(k) plan
  • Commuter benefits