Senior Data Quality Engineer

Growth Acceleration Partners
Colorado Springs, CO

About The Position

We are looking for a Senior Data Quality Engineer with strong expertise in validating data pipelines, ETL processes, and enterprise Data Warehouse environments. In this role, you will be responsible for ensuring the integrity, reliability, and consistency of the data powering analytics and reporting platforms. The primary focus of this position is validating the quality of data as it moves across pipelines, transformations, and warehouse layers.

You will work closely with Data Engineers and Data Analysts to analyze datasets, review ETL pipelines, validate transformations, and detect anomalies within analytical data environments. This role requires a strong analytical mindset and a deep understanding of how data flows across systems and integrations.

This is not a traditional QA role focused on application testing or BI dashboards. Instead, it is a data-focused role centered on validating the quality, completeness, and accuracy of analytical datasets before they are consumed by reporting or analytics systems.

Requirements

  • 5+ years of experience working with data validation, data quality, or data pipeline reliability
  • Proven experience working with Data Warehouses and analytical data models
  • Strong experience validating ETL pipelines and data integrations feeding a Data Warehouse
  • Experience analyzing large datasets to identify data quality issues, inconsistencies, or anomalies
  • Experience supporting data validation processes prior to analytics or reporting consumption
  • Experience integrating data validation checks into CI/CD workflows
  • Strong SQL expertise for data validation and troubleshooting
  • Strong Python skills for data processing and validation
  • Deep understanding of how data flows across systems and integrations
  • Strong analytical and investigative mindset with a Data Engineer / Data Analyst perspective
  • Advanced English proficiency (spoken and written)
  • Excellent documentation and communication skills
  • Ability to identify risks and proactively address data quality issues
  • Collaborative mindset when working with data and engineering teams
  • Strong attention to detail and problem-solving abilities
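To give a sense of the SQL-driven validation work involved, here is a minimal, self-contained sketch of source-to-target reconciliation. The tables, names, and data are purely illustrative (an in-memory SQLite database stands in for real source and warehouse systems), not specifics of this role:

```python
import sqlite3

# Hypothetical in-memory "source" and "warehouse" tables used only to
# illustrate the idea; real checks would run against production systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(conn, source, target, key):
    """Compare row counts and list keys present in source but missing in target."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = conn.execute(
        f"SELECT {key} FROM {source} EXCEPT SELECT {key} FROM {target}"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

src, tgt, missing = reconcile(conn, "src_orders", "dw_orders", "order_id")
print(src, tgt, missing)  # order_id 3 was dropped somewhere in the pipeline
```

In practice this kind of check would be parameterized per table and scheduled alongside the ETL runs, with discrepancies escalated for root cause analysis.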

Nice To Haves

  • Experience working with Microsoft Fabric
  • Familiarity with Databricks
  • Experience with Azure Data Factory or similar orchestration tools
  • Familiarity with Power BI to understand downstream data consumption

Responsibilities

  • Ensure the accuracy, completeness, and consistency of data entering the Data Warehouse
  • Validate analytical datasets and data models across warehouse environments
  • Analyze datasets to detect data quality issues, inconsistencies, and anomalies
  • Perform reconciliation and cross-validation between source systems and warehouse outputs
  • Apply data quality standards including completeness, accuracy, consistency, and reliability
  • Review and validate ETL pipelines and data integrations feeding the Data Warehouse
  • Analyze ingestion workflows and transformation logic across data pipelines
  • Validate transformations across staging, processing, and warehouse layers
  • Perform source-to-target data validation across environments
  • Ensure pipelines deliver high-quality datasets to downstream analytics systems
  • Write advanced SQL queries to validate datasets and transformations
  • Investigate discrepancies across multiple data sources
  • Identify anomalies, unexpected patterns, or data inconsistencies
  • Perform root cause analysis on data quality issues across systems
  • Develop automated data validation processes using Python
  • Implement repeatable validation checks across data pipelines
  • Build monitoring checks for data completeness and integrity
  • Integrate validation processes into CI/CD pipelines
  • Partner with Data Engineers and Data Analysts to maintain reliable analytical datasets
  • Support validation activities before data is consumed by analytics platforms or reporting tools
  • Collaborate with governance teams to maintain data quality standards
  • Document validation rules, processes, and methodologies
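The automation responsibilities above (repeatable checks, completeness monitoring, CI/CD integration) can be sketched in Python. This is an illustrative outline only, with hypothetical check names, columns, and thresholds; a real implementation would run against warehouse queries and fail the pipeline on errors:

```python
# Minimal sketch of repeatable data quality checks that could run as a CI step.
# Column names and thresholds are illustrative, not taken from the posting.

def check_not_null(rows, column):
    """Fail if any row has a NULL/None value in a required column."""
    bad = [r for r in rows if r.get(column) is None]
    return len(bad) == 0, f"{len(bad)} rows with NULL {column}"

def check_row_count(rows, minimum):
    """Completeness guard: flag suspiciously small loads."""
    return len(rows) >= minimum, f"got {len(rows)} rows, expected >= {minimum}"

def run_checks(rows):
    """Run all checks and return the failure messages (empty list = pass)."""
    checks = [
        check_not_null(rows, "order_id"),
        check_row_count(rows, minimum=1),
    ]
    return [msg for ok, msg in checks if not ok]

rows = [{"order_id": 1}, {"order_id": None}]
failures = run_checks(rows)
# In a CI/CD pipeline, a non-empty failure list would produce a non-zero
# exit code, blocking the dataset from reaching downstream analytics.
```

Frameworks such as Great Expectations or dbt tests cover the same ground in production; the sketch simply shows the shape of the validation logic.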