Senior Data Architect

Oura
San Francisco, CA

About The Position

Our mission at Oura is to empower every person to own their inner potential. Our award-winning products help our global community gain a deeper knowledge of their readiness, activity, and sleep quality by using their Oura Ring and its connected app. We've helped millions of people understand and improve their health by providing daily insights and practical steps to inspire healthy lifestyles.

Empowering the world starts with living our values and empowering our team. As a quickly growing company focused on helping people live healthier and happier lives, we ensure that our team members have what they need to do their best work, both in and out of the office.

About the Role

We are seeking an experienced Senior Data Architect to join our unified data mesh platform team. Reporting to the Sr. Director of Data Management, this role will be responsible for setting the data foundations and models for our Data Products to accelerate our business growth, deepen our product and membership understanding, and optimize our business operations.

We are looking for a Data Architect/Modeler with deep expertise in modern cloud architectures and the Data Mesh approach. You will be responsible for designing the structural foundations of our data products, ensuring they are interoperable, scalable, and trustworthy. You will bridge the gap between complex business requirements and high-performance technical design, acting as the primary blueprint designer for our global data lifecycle.

Requirements

  • Experience: 8+ years of experience in data architecture or modeling, with a strong technical foundation in cloud-based platforms (AWS, GCP, Databricks, or Azure).
Cloud Platform & Infrastructure Mastery

  • Multi-Cloud Expertise: Hands-on expertise in major cloud platforms, including AWS (S3, Kinesis, Glue, Athena), GCP (BigQuery, Vertex AI), or Azure.
  • Modern Data Warehousing: Proficiency in designing and managing cloud-native warehouses like Snowflake or Google BigQuery.
  • Lakehouse Architecture: Ability to build and operate a Unified Global Lakehouse that merges the flexibility of a data lake with the management of a warehouse.
  • Containerization & Workflows: Experience with Docker, Pulumi, and various workflow engines to manage complex data processing tasks.
Data Modeling & Strategy

  • Data Mesh Principles: Familiarity with the Data Mesh approach, specifically managing federated data governance and decentralized data ownership.
  • Lifecycle Management: Capability to lead the entire data lifecycle, from initial data definition to final delivery and consumption.
  • Standardization: Expertise in Master Data Management (MDM) and Reference Data Management (RDM) to ensure consistency across the enterprise.
  • Schema Design: Proficiency with modern table formats like Iceberg and transformation tools like dbt, including dbt Cloud on Databricks for SQL-based modeling (bronze/silver/gold layers), to maintain high-quality data structures.
Advanced Analytics & AI Readiness

  • AI/ML Integration: Experience with production-quality AI/ML and predictive modeling, leveraging platforms like Vertex AI and MLOps frameworks.
  • LLM & NLP Design: Skill in designing architectures for Large Language Models (LLM) using Retrieval Augmented Generation (RAG) and vector-based data designs.
  • Automated Insights: Ability to design systems for predictive analytics, anomaly detection, and automated reporting.
  • Agentic AI: Familiarity with Agentic AI to deliver interactive, intelligent dashboards and self-serve capabilities.
Governance, Security & Compliance

  • Data Residency: Knowledge of global data residency requirements and privacy standards.
  • Regulatory Standards: Expertise in establishing HIPAA and PHI (Protected Health Information) standards within regulated environments.
  • Quality Assurance: Championing best practices for data accuracy, reliability, and trustworthiness through rigorous validation and peer review.
Technical Foundations & Tools

  • Programming & Processing: Broad knowledge of software fundamentals and stream processing using Kafka, Kinesis, Python, Spark, and SQL.
  • Integration: Expertise in integrating diverse data sources, including transactional, product, and compliance data, into a centralized function using tools such as dbt and Fivetran.
  • Orchestration: Job and task automation and scheduler design using tools such as Airflow, Dagster, dbt, and Databricks Lakeflow.
  • Observability: Design for high availability and for diagnosing performance bottlenecks, including long-running, high-cost tasks.
  • Distributed Processing: Spark (via Databricks) for large-scale ETL/ML.

Nice To Haves

  • Experience in the healthcare, wellness, consumer electronics, wearables, digital health, or subscription services industries.
  • Broad knowledge of software fundamentals, databases, warehouses, and system design, with experience in various programming languages
  • Experience streamlining multiple data pipelines into a centralized function that allows effective and efficient oversight of business processes and product development; expertise in integrating diverse data sources, including product, transactional, and compliance data
  • Experience embedding analytics into product, finance, sales, marketing, business operations, and customer experience functions
  • Ability to navigate hypergrowth while managing regulatory constraints
  • Strong technical foundation, demonstrated through education or early-career hands-on roles
  • Experience with production-quality AI/ML and predictive modeling, leveraging analytics and MLOps tools like Vertex AI, MLflow, dbt, MicroStrategy, Tableau, Power BI, and ThoughtSpot
  • Experience leading the development of a robust and scalable data platform that can support the organization’s growing data needs with cost tiering
  • Advanced degree in Engineering, Data Science, Statistics, Computer Science, or a related field is preferred

Responsibilities

  • Architect & Model: Design and manage data domains to enable the creation of interoperable, trustworthy data products.
  • Cloud Infrastructure: Build and optimize Oura’s Data Lakehouse leveraging Databricks, Google BigQuery, and Snowflake to process terabyte-to-petabyte-scale data.
  • Data Mesh Governance: Implement federated data governance within the data mesh to ensure processes meet privacy, compliance (HIPAA/PHI), and security requirements.
  • Collaborate: Partner with Data Engineering, Data Science, and Business Domain owners to advocate for unified analytics and modeling best practices.
  • AI Readiness: Design vector-based data architectures and Retrieval Augmented Generation (RAG) patterns to enable LLM reporting and Agentic AI.
  • Standardization: Establish scalable data management frameworks and a governed data dictionary to enable organizational self-service.

Benefits

  • At Oura, we care about you and your well-being. Everyone here at Oura has a ring of their own and we are continually looking to improve employee health.
  • Competitive salary and equity packages
  • Health, dental, vision insurance, and mental health resources
  • An Oura Ring of your own plus employee discounts for friends & family
  • 20 days of paid time off, plus 13 paid holidays and 8 days of flexible wellness time off
  • Paid sick leave and parental leave