Sr Data Modeler

O'Reilly Auto Parts
$108,086 - $180,144

About The Position

The Sr Data Modeler is a key technical contributor responsible for designing, developing, and optimizing conceptual, logical, and physical data models across structured and semi-structured platforms including relational, NoSQL, and real-time systems. This role ensures data models are scalable, governed, and aligned with performance and business requirements. As a senior practitioner, the role partners closely with engineers, stakeholders, and product teams to translate domain-specific data needs into robust models for reporting, analytics, and AI use cases. The Senior Data Modeler also promotes modeling best practices, contributes to data governance efforts, and supports the implementation of hybrid table and streaming-aware data architectures.

Requirements

  • Advanced experience designing logical and physical data models for OLTP, OLAP, and streaming systems, with focus on performance, extensibility, and alignment with platform standards.
  • Strong experience in relational data modeling, including dimensional modeling (star/snowflake), data vault, and normalized structures, using modeling tools and notations such as Erwin or UML (a minimal star-schema sketch appears after this list).
  • Advanced competence in developing and managing data models across data platforms, such as Snowflake, BigQuery, PostgreSQL, and cloud SQL services.
  • Experience with NoSQL and semi-structured data models (e.g., MongoDB, Cassandra), with ability to determine fit-for-purpose based on access patterns and data volume.
  • Basic to intermediate experience with graph databases (e.g., Neo4j) and graph modeling concepts that support relationship-driven use cases.
  • Strong experience modeling for analytics and machine learning, including schema design for curated datasets, feature stores, and metric layers in collaboration with analytics and data science teams.
  • Proficient in translating data contracts and business definitions into reusable semantic models that power reporting tools and semantic APIs.
  • Experience incorporating streaming-aware modeling considerations, such as schema evolution, data partitioning, and transformation idempotency, into batch and real-time pipelines (an idempotency sketch appears after this list).
  • Advanced ability to work with product owners and business stakeholders to translate business requirements into well-structured data entities, relationships, and domains.
  • Strong understanding of enterprise business processes, including customer, marketing, supply chain, and store operations, and of how data models support analytics, ML, and decision automation.
  • Experience working in agile data product environments, partnering with engineers and analysts to ensure models reflect the evolving needs of the business.
  • Ability to anticipate business implications of schema changes, propose design alternatives, and communicate trade-offs effectively with both technical and business audiences.
  • Experience leading data modeling efforts on cross-functional teams or key domain areas, ensuring quality, consistency, and reuse of modeling assets.
  • Ability to mentor junior data modelers and analysts, conducting model reviews, documenting best practices, and providing architectural guidance.
  • Experience participating in or facilitating modeling and architecture reviews, driving alignment on model design, performance trade-offs, and governance practices.
  • Strong contributor to modeling playbooks, reusable templates, and shared modeling standards that elevate consistency across teams.
  • Experience aligning data models to enterprise taxonomies, data product strategies, and consumption needs, including self-service reporting, semantic layers, and ML pipelines.
  • Strong understanding of data modeling’s role in data governance, including lineage, metadata, versioning, and compliance.
  • Advanced experience integrating semantic models and metrics stores (e.g., dbt metrics, Looker models, AtScale) for reusable KPI delivery.
  • Ability to influence modeling direction by contributing to strategic initiatives, such as KPI harmonization, metric unification, and domain ontology design.
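
To make the dimensional-modeling expectation concrete, the sketch below defines a minimal star schema, one fact table joined to conformed dimensions, using SQLAlchemy. The table and column names (fact_sales, dim_store, dim_product, dim_date) are illustrative assumptions, not an actual O'Reilly schema.

```python
# Minimal star-schema sketch: one fact table keyed to conformed dimensions.
# All table and column names are invented for illustration only.
from sqlalchemy import (
    create_engine, MetaData, Table, Column, Integer, Numeric, String, Date, ForeignKey
)

metadata = MetaData()

dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),   # surrogate key, e.g. 20240131
    Column("calendar_date", Date, nullable=False),
    Column("fiscal_quarter", String(6), nullable=False),
)

dim_store = Table(
    "dim_store", metadata,
    Column("store_key", Integer, primary_key=True),
    Column("store_number", String(10), nullable=False),
    Column("region", String(50)),
)

dim_product = Table(
    "dim_product", metadata,
    Column("product_key", Integer, primary_key=True),
    Column("sku", String(20), nullable=False),
    Column("category", String(50)),
)

fact_sales = Table(
    "fact_sales", metadata,
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("store_key", Integer, ForeignKey("dim_store.store_key"), nullable=False),
    Column("product_key", Integer, ForeignKey("dim_product.product_key"), nullable=False),
    Column("units_sold", Integer, nullable=False),
    Column("net_sales_amount", Numeric(12, 2), nullable=False),
)

if __name__ == "__main__":
    # Materialize the model against an in-memory SQLite engine just to validate the DDL.
    engine = create_engine("sqlite:///:memory:")
    metadata.create_all(engine)
```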
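
The streaming-aware bullet mentions transformation idempotency; the sketch below shows one assumed interpretation: keyed, last-event-wins upserts, so that replaying the same batch leaves state unchanged. The order_id key and event fields are hypothetical.

```python
# Sketch of transformation idempotency: reprocessing the same events yields the
# same upserted state, because writes are keyed and "last event wins" by timestamp.
from datetime import datetime

def apply_events(state: dict, events: list[dict]) -> dict:
    """Upsert events into state keyed by a business key; the later event_ts wins."""
    for e in events:
        key = e["order_id"]                      # illustrative business key
        current = state.get(key)
        if current is None or e["event_ts"] > current["event_ts"]:
            state[key] = e
    return state

events = [
    {"order_id": "A1", "status": "created", "event_ts": datetime(2024, 1, 1, 9, 0)},
    {"order_id": "A1", "status": "shipped", "event_ts": datetime(2024, 1, 1, 12, 0)},
]

state = apply_events({}, events)
replayed = apply_events(dict(state), events)     # reprocess the same batch
assert state == replayed                         # idempotent: no change on replay
print(state["A1"]["status"])                     # shipped
```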

Nice To Haves

  • Experience modeling for hybrid workloads, supporting both transactional and analytical use cases using techniques like flattened views and wide tables.
  • Working knowledge of streaming and event-based modeling patterns, including Kafka schema registry integration and schema evolution design (a schema-evolution sketch appears after this list).
  • Familiarity with open table formats such as Apache Iceberg, Delta Lake, or Hudi and their implications for model design and schema portability.
  • Exposure to lineage and metadata integration tools such as Alation, Collibra, or custom metadata registries, supporting model discoverability and traceability.
  • Exposure to enabling LLM-ready data assets, such as structured knowledge graphs and semantically rich entities used by AI agents or copilots.
  • Demonstrated ability to support platform migrations or modeling refactoring efforts, including legacy to cloud transformations.
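
As a concrete, assumed illustration of backward-compatible schema evolution, the sketch below adds an optional field with a default to an Avro record and reads old data under the new reader schema with fastavro. The PartPriceChange record and its fields are invented for illustration; a production setup would typically register both versions with a schema registry.

```python
# Backward-compatible schema evolution: a new field with a default lets records
# written under the old schema still be read under the new one.
import io
from fastavro import parse_schema, schemaless_writer, schemaless_reader

schema_v1 = parse_schema({
    "type": "record", "name": "PartPriceChange", "fields": [
        {"name": "sku", "type": "string"},
        {"name": "new_price", "type": "double"},
    ],
})

# v2 adds an optional field with a default, keeping the change backward compatible.
schema_v2 = parse_schema({
    "type": "record", "name": "PartPriceChange", "fields": [
        {"name": "sku", "type": "string"},
        {"name": "new_price", "type": "double"},
        {"name": "currency", "type": "string", "default": "USD"},
    ],
})

buf = io.BytesIO()
schemaless_writer(buf, schema_v1, {"sku": "BRK-1010", "new_price": 49.99})
buf.seek(0)

# Reading old data with the new (reader) schema fills in the default value.
record = schemaless_reader(buf, schema_v1, schema_v2)
print(record)   # {'sku': 'BRK-1010', 'new_price': 49.99, 'currency': 'USD'}
```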

Responsibilities

  • Design domain-level conceptual, logical, and physical data models across OLTP and OLAP systems, with emerging support for streaming and hybrid workloads.
  • Apply best practices in relational modeling using tools and notations such as Erwin, dbt, and UML, ensuring alignment with medallion or data mesh architecture principles.
  • Implement multi-model data environments that span relational (e.g., Snowflake, BigQuery, PostgreSQL), NoSQL (MongoDB, Cassandra), graph (e.g., Neo4j), and event-based (e.g., Kafka, Pub/Sub) systems.
  • Develop dimensional models, normalized schemas, and de-normalized views tailored for operational reporting, dashboarding, and analytical queries.
  • Collaborate with platform and engineering teams to ensure models support schema evolution, model extensibility, and efficient query performance.
  • Translate business requirements and analytics use cases into well-structured data models, ensuring semantic consistency across domains.
  • Recommend modeling techniques and platform selection (relational vs NoSQL vs streaming) based on performance, data type, and user needs.
  • Work closely with engineers and product owners to ensure model designs support KPI alignment, reusability, and future-state scalability.
  • Lead and implement modeling requirements for feature stores and analytic datasets used in analytics, AI, and machine learning pipelines.
  • Maintain detailed documentation including entity definitions, data dictionaries, model lineage, and change logs in cataloging tools (e.g., Alation, Collibra).
  • Contribute to the enforcement of modeling standards such as naming conventions, schema versioning, and semantic layering practices.
  • Support governance efforts through consistent metadata management, model certification, and stewardship handoff documentation.
  • Execute schema governance processes to ensure backward compatibility and data trust across ingestion and consumption layers.
  • Develop performant physical data models for Snowflake, BigQuery, PostgreSQL, and other modern cloud-native data warehouses and platforms.
  • Collaborate with data engineers to implement optimal indexing, clustering, partitioning, and table design strategies.
  • Contribute to troubleshooting performance issues related to model complexity, data skew, or inefficient joins in reporting and data science pipelines.
  • Support continuous improvement of data models by analyzing access patterns, profiling large datasets, and proposing schema refinements.
  • Work with engineering teams to embed models into ingestion pipelines, transformation layers, and semantic APIs.
  • Validate that dbt models, ETL/ELT logic, and CI/CD deployment scripts accurately reflect logical and physical designs.
  • Support integration of models with real-time systems (e.g., Kafka, Pub/Sub) and ensure models function across batch and streaming environments.
  • Participate in quality assurance cycles by reviewing test coverage, edge case handling, and production readiness of model implementations.
  • Contribute to the development of reusable semantic models for metrics stores, self-service BI tools, and advanced analytics layers.
  • Help unify metric definitions and business logic across systems through dimensional modeling and modular dbt workflows (a metric-definition sketch appears after this list).
  • Work with analytics engineers to align modeling logic with metric stores and dashboards, enabling consistent performance and insight delivery.
  • Contribute to graph and document modeling efforts as needed for use cases such as product attribution, recommendation, or customer graph enrichment.
  • Embed structural validation, referential integrity checks, and schema verification into the development lifecycle for all new data models (a validation sketch appears after this list).
  • Collaborate with engineers and platform teams to ensure data health monitoring (e.g., freshness, null tracking, type mismatches) is modeled at the schema level.
  • Support automated testing and CI/CD integration of models using tools like dbt, Great Expectations, or similar frameworks.
  • Participate in resolving modeling-related issues around schema drift, inconsistent joins, or conflicting metric definitions.
  • Serve as a mentor and resource to junior data modelers and engineers, providing guidance on modeling fundamentals, SQL performance, and semantic alignment.
  • Contribute to modeling playbooks, reusable templates, and internal knowledge repositories.
  • Participate in technical reviews and modeling community of practice discussions, driving awareness of emerging techniques.
  • Stay up to date with modern modeling techniques for streaming, graph, document, and multi-model databases, applying new methods as appropriate.
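
One assumed way to embed the structural validation described above, shown here with plain pandas rather than any specific framework: referential-integrity, null, and type checks for a fact table against one of its dimensions. The column names are illustrative.

```python
# Structural checks for a modeled fact table: referential integrity against a
# dimension, null tracking, and a simple type verification.
import pandas as pd

dim_product = pd.DataFrame({"product_key": [1, 2, 3]})
fact_sales = pd.DataFrame({
    "product_key": [1, 2, 9],          # 9 has no matching dimension row
    "units_sold": [4, None, 2],        # a null that should be flagged
})

def validate(fact: pd.DataFrame, dim: pd.DataFrame) -> list[str]:
    issues = []
    orphans = set(fact["product_key"]) - set(dim["product_key"])
    if orphans:
        issues.append(f"referential integrity: orphan product_key values {sorted(orphans)}")
    null_counts = fact.isna().sum()
    for col, n in null_counts[null_counts > 0].items():
        issues.append(f"null tracking: column '{col}' has {n} null(s)")
    if not pd.api.types.is_numeric_dtype(fact["units_sold"]):
        issues.append("type check: units_sold is not numeric")
    return issues

print(validate(fact_sales, dim_product))
```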
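
To illustrate the idea of a reusable, centrally defined metric, the sketch below compiles one metric definition into SQL for arbitrary groupings. The METRICS structure and compile_metric helper are hypothetical, not dbt's or Looker's actual syntax; the table and column names are also assumptions.

```python
# A single metric definition reused across groupings, so the business logic for
# "net sales" lives in one place instead of being re-derived per dashboard.
METRICS = {
    "net_sales": {
        "expression": "SUM(net_sales_amount)",
        "source": "fact_sales",
    },
}

def compile_metric(name: str, group_by: list[str]) -> str:
    """Render the metric as a GROUP BY query over the requested dimensions."""
    m = METRICS[name]
    dims = ", ".join(group_by)
    return (
        f"SELECT {dims}, {m['expression']} AS {name}\n"
        f"FROM {m['source']}\n"
        f"GROUP BY {dims}"
    )

print(compile_metric("net_sales", ["store_key", "date_key"]))
```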

Benefits

  • Competitive Wages & Paid Time Off
  • Stock Purchase Plan & 401k with Employer Contributions Starting Day One
  • Medical, Dental, & Vision Insurance with Optional Flexible Spending Account (FSA)
  • Team Member Health/Wellbeing Programs
  • Tuition Educational Assistance Programs
  • Opportunities for Career Growth