Domino's • Posted 8 days ago
$140,000 - $155,000/Yr
Full-time • Manager
Hybrid • Ann Arbor, MI
5,001-10,000 employees

As a Manager – Data Quality and Operations focused on enterprise data solutions, your primary responsibility will be to ensure the delivery of high-quality, reliable, and efficient data pipelines and operations across the organization. This is a senior technical leadership role, accountable for end-to-end data quality engineering and operational excellence for cloud-based data solutions. The ideal candidate will have hands-on experience in large-scale data pipeline management, automated data quality assurance, and production operations, with a proven track record of leading cross-functional teams to drive key decisions and continuous improvement. The Manager – Data Quality and Operations will partner closely with Data Engineering, Platforms, Analytics, and Digital/AI/ML teams to define and implement best practices for data quality, automated testing, and operational support, enabling trusted data activation across the enterprise.

Location: Domino’s World Resource Center; 30 Frank Lloyd Wright Dr, Ann Arbor, MI 48105
Shift: Full-time; salaried
Job Posting Salary: $140,000-$155,000, plus bonus
Role: Hybrid (4 days at Domino’s Headquarters, Ann Arbor; Fridays remote)

Lead Data Quality Engineering & Data QA
  • Build quality in: Ensure data pipelines are engineered with quality-first principles and are functionally aligned to business and technical requirements.
  • Design quality controls: Define, implement, and maintain automated QA checks for critical data assets with thresholds and SLA-aligned alerting and escalation.
  • Enterprise data quality framework: Establish best practices and measurable DQM standards (profiling, validity, completeness, timeliness, consistency, accuracy) across domains.
  • Test automation at scale: Drive in-sprint and regression automation for batch and streaming workloads; integrate tests into CI/CD to prevent regressions and accelerate release cycles.
  • Coach and develop talent: Lead a pod of QA/Data Quality specialists; raise technical bar in SQL/Python, test design, and root-cause analysis.

Run Data Operations
  • Own production SLAs: Monitor and support an extensive footprint of pipelines; ensure uptime and on-time delivery for key datasets, metrics, and downstream products.
  • Triage & remediate fast: Lead incident response for data quality/availability issues; drive root-cause analysis (RCA) and corrective actions; reduce MTTR through automation and playbooks.
  • Analyze & prevent: Apply exploratory data analysis (EDA) to quantify impact (blast radius), identify failure patterns, and implement preventive controls and observability.
  • Harden the pipeline factory: Mature CI/CD (branching, approvals, quality gates) and release automation; improve managed file transfer (MFT) and orchestration flows for reliability and throughput.
  • Build the team: Recruit, onboard, and mentor Data Operations Analysts to support enterprise data modernization initiatives at scale.
  • Participate in an on-call rotation for critical data products and platform components.

  • Hands-on technical leadership in data engineering, QA/quality engineering, and data operations.
  • Deep proficiency in SQL, ETL tools, and Python for test automation, data validation, and triage.
  • Strong experience with ETL/ELT and orchestration (e.g., Control-M, Airflow, Databricks Jobs).
  • CI/CD pipelines for data (Git/GitHub, Jenkins/GitHub Actions) including quality gates and automated regressions.
  • Familiarity with MFT platforms and secure file transfer patterns.
  • Proven track record building DQ rulesets (profiling, constraints, anomaly detection) and putting them into production with monitoring and alerting.
  • Production support experience: incident management, RCA, and post-mortems with action tracking and verification.
  • Strong problem-solving skills; ability to translate requirements into executable tests and controls.
  • Builder’s mindset with bias for automation and measurable outcomes.
  • Clear, candid communicator; able to translate between engineering detail and business impact.
  • Empathetic coach who raises the team’s technical bar and problem-solving capability.
  • Drives a culture of documentation, observability, and continuous improvement.
  • BS/MS in Computer Science, Information Systems, Data/Analytics, or equivalent experience.
  • 7–10+ years in data engineering/operations with 3+ years leading QA/DQ or SRE-like functions for data systems.
  • Experience with cloud data platforms (Azure/AWS/GCP) and cloud data warehouses/lakehouses; Databricks strongly preferred.
  • Familiarity with data warehousing, dimensional modeling, and performance tuning.
  • Exposure to Customer 360/MDM and enterprise data governance.
  • Experience with BI/semantic layers and data product SLAs.
  • Background working with streaming (Kafka, Event Hubs) and schema management.

  • Paid Holidays and Vacation
  • Medical, Dental & Vision benefits that start on the first day of employment
  • No-cost mental health support for employees and dependents
  • Childcare tuition discounts
  • No-cost fitness, nutrition, and wellness programs
  • Fertility benefits
  • Adoption assistance
  • 401k matching contributions
  • 15% off the purchase price of stock
  • Company bonus