Principal Engineer - Data Platforms

Dutch Bros Coffee
Tempe, AZ (Hybrid)

About The Position

It's fun to work in a company where people truly believe in what they are doing. At Dutch Bros Coffee, we are more than just a coffee company. We are a fun-loving, mind-blowing company that makes a difference one cup at a time.

Position Overview

As the Principal Engineer in the Enterprise Data Platform, you will lead the modernization and optimization of Dutch Bros’ foundational data ecosystem. In this high-impact, hands-on individual contributor role, you will design and operate scalable, resilient infrastructure to power distributed analytics, machine learning, and AI-driven workflows. You are a builder at heart, responsible for writing proof-of-concepts, defining rigorous coding standards, and championing AI-assisted development tools that reduce toil and accelerate team velocity. By partnering with other principal engineers and mentoring the broader team, you will bridge the gap between high-level architecture and seamless execution to deliver a world-class, self-service data experience for the entire organization.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • 15+ years of software engineering experience, with a minimum of 10 years specializing in backend distributed systems or data infrastructure at scale.
  • Expert-level Python and SQL proficiency (10+ years) rooted in a career of building production-grade software.
  • Proven track record of designing large-scale data platforms with a deep understanding of CAP theorem, eventual consistency, and the trade-offs between batch and streaming architectures.
  • Hands-on mastery of Snowflake (internals and clustering), dbt (macro design and Jinja), Airflow (scheduler internals), and Power BI (Import vs. Live connection).
  • Comprehensive knowledge of AWS services, including IAM, VPC, Glue, S3, SFTP, Lambda, CloudWatch, and SNS.
  • Experience implementing GitLab CI/CD pipelines and Datadog for robust system monitoring and alerting.
  • Ability to design RAG architectures, manage vector databases, and integrate LLMs into complex data pipelines (a brief, illustrative sketch follows this list).
  • Skilled in writing persuasive RFCs and ADRs that drive consensus among architects and engineering leadership.
  • Proven ability to influence technical strategy and facilitate cross-functional alignment across organizational levels without direct managerial authority.
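
For context on the RAG work referenced above, here is a minimal, illustrative sketch in Python. It is not Dutch Bros code: the embed and call_llm helpers are hypothetical placeholders for an embedding model and an LLM endpoint, and the in-memory cosine-similarity lookup stands in for a managed vector database.

    # Minimal RAG sketch: embed a question, retrieve the closest chunks,
    # and ground the LLM answer in that context. Placeholders are marked.
    import numpy as np

    def embed(text: str) -> np.ndarray:
        """Hypothetical placeholder: return an embedding vector for `text`."""
        raise NotImplementedError

    def call_llm(prompt: str) -> str:
        """Hypothetical placeholder: send `prompt` to an LLM and return its completion."""
        raise NotImplementedError

    def retrieve(question: str, chunks: list[str], vectors: np.ndarray, k: int = 3) -> list[str]:
        # Cosine similarity between the question embedding and each chunk embedding.
        q = embed(question)
        sims = vectors @ q / (np.linalg.norm(vectors, axis=1) * np.linalg.norm(q) + 1e-9)
        return [chunks[i] for i in np.argsort(-sims)[:k]]

    def answer(question: str, chunks: list[str], vectors: np.ndarray) -> str:
        # Assemble the retrieved chunks into a grounded prompt.
        context = "\n\n".join(retrieve(question, chunks, vectors))
        return call_llm(f"Answer using only this context:\n{context}\n\nQuestion: {question}")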

Responsibilities

Technical Strategy & Platform Architecture

  • Define the long-term technical architecture for the Enterprise Data Platform, translating business strategy into scalable Data Mesh and domain-oriented specifications.
  • Implement automated CI/CD pipelines and Infrastructure as Code (IaC) to foster a unified engineering culture across disciplines.
  • Design robust APIs that enable seamless data consumption across operations, finance, and product teams.

AI-Assisted Development & Integration

  • Lead the integration of AI-assisted development tools (Cursor, MCP, Copilot) to accelerate developer velocity and reduce cognitive load.
  • Scale LLM-driven code generation for AWS data pipelines, including automated test creation, documentation, and semantic schema generation.
  • Leverage Amazon Bedrock, Snowflake Cortex, and AI-enabled IDEs to optimize the data lifecycle and reduce delivery lead times.

Data Engineering at Scale

  • Build resilient ELT/ETL pipelines utilizing S3, Lambda, Glue, dbt, and Airflow (MWAA); an illustrative DAG sketch follows this list.
  • Establish data quality, observability, lineage, and SLAs as core, first-class features of the data platform.
  • Standardize enterprise-wide schema design, modeling patterns, and deployment workflows.

Machine Learning & Infrastructure

  • Design and productionize end-to-end ML infrastructure, including feature stores, model experimentation frameworks, and deployment monitoring.
  • Build optimized ETL/ELT workflows for training data and model deployment leveraging Snowflake ML (Snowpark) and Amazon SageMaker.

Engineering Standards & Technical Influence

  • Enforce high standards for code quality through rigorous PR reviews, unit testing, and automated schema validation within CI/CD pipelines.
  • Architect SRE-focused resiliency frameworks and self-healing systems to ensure 99.9% availability for critical data pipelines.
  • Partner with Data Science, Product, and Platform teams to align on performance, compliance, and feature engineering.
  • Mentor Senior and Lead engineers through design reviews, RFCs, and pair programming to elevate the organization's technical bar.

Semantic Engineering & Graph RAG

  • Lead the design and proof-of-concept of Graph RAG (Retrieval-Augmented Generation) architectures, enabling LLM agents to query structured Snowflake data via Knowledge Graphs; a brief sketch follows this list.
  • Apply knowledge of open-source Knowledge Graph solutions to create advanced patterns for semantic data retrieval.
  • Coordinate project testing, deployment, and post-implementation activities, ensuring successful delivery and applying lessons learned to future projects.
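
To give a flavor of the pipeline work described under "Data Engineering at Scale," here is a minimal, illustrative Airflow (MWAA-style) DAG skeleton in Python. The bucket name, Glue job name, and dbt project path are hypothetical placeholders, and a production DAG would also wait on Glue job completion and handle retries and alerting.

    # Illustrative daily ELT skeleton: land raw files in S3, standardize them
    # with a Glue job, then build dbt models. Names and paths are hypothetical.
    from datetime import datetime

    import boto3
    from airflow.decorators import dag, task
    from airflow.operators.bash import BashOperator

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
    def daily_sales_elt():
        @task
        def land_raw_files() -> str:
            # Placeholder: copy the day's extracts into the S3 landing prefix.
            return "s3://example-landing/sales/"  # hypothetical bucket

        @task
        def run_glue_job(landing_prefix: str) -> None:
            # Start the (hypothetical) Glue job that standardizes the landed files.
            boto3.client("glue").start_job_run(
                JobName="standardize_sales",
                Arguments={"--landing_prefix": landing_prefix},
            )

        dbt_build = BashOperator(
            task_id="dbt_build",
            bash_command="dbt build --project-dir /usr/local/airflow/dbt",  # hypothetical path
        )

        run_glue_job(land_raw_files()) >> dbt_build

    daily_sales_elt()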
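And for the "Semantic Engineering & Graph RAG" item, a minimal sketch of grounding an LLM in a knowledge-graph neighborhood, again in Python. The graph contents, table names, and call_llm helper are hypothetical placeholders; a real implementation would derive the graph from curated Snowflake metadata and route the call through a service such as Amazon Bedrock or Snowflake Cortex.

    # Illustrative Graph RAG sketch: pull a small neighborhood from a knowledge
    # graph and hand it to an LLM as grounding context. All names are hypothetical.
    import networkx as nx

    def call_llm(prompt: str) -> str:
        """Hypothetical placeholder for an LLM call."""
        raise NotImplementedError

    # Toy knowledge graph over (hypothetical) Snowflake tables and relationships.
    kg = nx.DiGraph()
    kg.add_edge("orders", "customers", relation="references")
    kg.add_edge("orders", "menu_items", relation="references")
    kg.add_edge("daily_sales", "orders", relation="aggregates")

    def graph_context(entity: str, hops: int = 1) -> str:
        # Serialize the entity's n-hop neighborhood as plain-text triples.
        nodes = nx.ego_graph(kg, entity, radius=hops, undirected=True).nodes
        triples = [
            f"{u} --{d['relation']}--> {v}"
            for u, v, d in kg.edges(data=True)
            if u in nodes and v in nodes
        ]
        return "\n".join(triples)

    def answer(question: str, entity: str) -> str:
        # Ground the LLM in the graph facts rather than raw table contents.
        prompt = (
            "Use only these knowledge-graph facts about our Snowflake tables:\n"
            f"{graph_context(entity)}\n\nQuestion: {question}"
        )
        return call_llm(prompt)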