Senior QA Data Testing Engineer

Growth Acceleration Partners · Colorado Springs, CO

About The Position

We are looking for a Senior QA Data Testing Engineer with strong expertise in validating data pipelines, ETL processes, and enterprise Data Warehouse environments. In this role, you will ensure the integrity, reliability, and consistency of the data that feeds analytics and reporting platforms.

The primary focus of the position is backend data quality validation: ensuring that datasets entering the Data Warehouse are accurate, complete, and consistent across systems. You will work closely with Data Engineers and Data Analysts to review ETL pipelines, validate data transformations, and identify anomalies in analytical datasets. The role requires a strong analytical mindset and a deep understanding of how data flows across systems and integrations.

This position is not focused on BI frontend testing or dashboard validation, but rather on ensuring that the data layer powering analytics and reporting systems is reliable and trustworthy.

Requirements

  • 5+ years of experience in QA Engineering with strong focus on data validation and data pipeline testing
  • Proven experience working with Data Warehouses and analytical data models
  • Strong experience validating ETL processes and data integrations feeding a Data Warehouse
  • Experience analyzing large datasets to identify data quality issues, inconsistencies, or anomalies
  • Experience supporting data validation prior to analytics or reporting consumption
  • Experience integrating data testing practices within CI/CD workflows
  • Strong SQL expertise for backend data validation and troubleshooting
  • Strong Python skills for data manipulation or processing
  • Deep understanding of data flows across systems and integrations
  • Strong analytical mindset with Data Engineer / Data Analyst perspective
  • Advanced English proficiency (spoken and written)
  • Excellent documentation and communication skills
  • Ability to identify risks and proactively address data quality issues
  • Collaborative mindset when working with data and engineering teams
  • Strong attention to detail and problem-solving abilities

Nice To Haves

  • Experience working with Microsoft Fabric
  • Familiarity with Databricks
  • Experience with Azure Data Factory or similar orchestration tools
  • Familiarity with Power BI (to understand downstream data consumption)

Responsibilities

Data Quality & Data Warehouse Validation

  • Ensure the quality, integrity, and reliability of data entering the Data Warehouse
  • Validate analytical data models and ensure datasets are consistent and complete
  • Analyze datasets to detect data quality issues, inconsistencies, and anomalies
  • Perform reconciliation and cross-validation between source systems and warehouse outputs
  • Apply data quality principles including completeness, accuracy, consistency, and reliability

ETL & Data Pipeline Testing

  • Review and validate ETL pipelines and data integrations feeding the Data Warehouse
  • Test ingestion workflows and transformation logic across data pipelines
  • Validate data transformations across staging, transformation, and warehouse layers
  • Perform source-to-target validation across different environments
  • Ensure pipelines deliver accurate datasets to downstream analytics systems

Backend Data Validation

  • Write advanced SQL queries to validate data transformations and dataset integrity
  • Analyze large datasets to identify discrepancies and unexpected patterns
  • Support validation of structured and semi-structured data pipelines
  • Conduct root cause analysis for data discrepancies across systems

Automation & Data Validation Tools

  • Develop automated data validation processes in Python
  • Implement repeatable validation checks across data pipelines
  • Support automated testing strategies for ETL and data integrations
  • Integrate testing processes within CI/CD environments

Cross-Functional Collaboration

  • Work closely with Data Engineers and Data Analysts to ensure the reliability of analytical data
  • Support validation and testing before data is consumed by analytics platforms or reporting tools
  • Collaborate with governance and engineering teams to maintain high data quality standards
  • Document testing methodologies and validation frameworks
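To illustrate the kind of source-to-target reconciliation work described above, here is a minimal, hypothetical sketch (not part of the role's actual toolchain) using Python's standard sqlite3 library; the table and column names are invented for the example, and a real pipeline would target the production warehouse instead of an in-memory database.

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table, key_column):
    """Basic completeness check: compare row counts and find keys
    present in the source but missing from the warehouse target."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    # EXCEPT returns source keys that never arrived in the target table
    missing = cur.execute(
        f"SELECT {key_column} FROM {source_table} "
        f"EXCEPT SELECT {key_column} FROM {target_table}"
    ).fetchall()
    return {
        "source_rows": src_count,
        "target_rows": tgt_count,
        "missing_keys": [row[0] for row in missing],
    }

# Demo: an in-memory database stands in for the staging and warehouse layers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders      (order_id INTEGER, amount REAL);
    INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders      VALUES (1, 10.0), (2, 20.0);
""")
report = reconcile_row_counts(conn, "staging_orders", "dw_orders", "order_id")
print(report)  # order 3 never reached the warehouse
```

Checks like this are typically parameterized per table and wired into the CI/CD pipeline so every deployment re-validates source-to-target completeness automatically.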