Quality Analytics Lead

Welocalize, Portland, OR

About The Position

The Quality Analytics Lead is the dedicated technical resource bridging Welo Data’s Analytics and Quality organizations. Sitting within the Analytics team, this senior IC partners enterprise-wide with Quality Managers, Analysts, and leadership to design and maintain the data models, measurement frameworks, and analytical infrastructure that power evidence-based quality decisions across programs and regions.

At its core, this is an analytics engineering role. The primary responsibility is building and owning the quality data layer: the dbt models, data marts, and Python-driven modeling that transform raw operational data into a trusted, well-documented foundation the Quality organization can rely on. Experimentation, stakeholder consulting, and BI delivery are all extensions of that foundation, not parallel tracks.

The ideal candidate combines deep fluency in modern data modeling with a genuine understanding of quality operations, AI training data workflows, and experimental design. They ensure the analytical systems they build directly improve how quality teams detect issues, validate improvements, and demonstrate impact to clients and leadership.

Requirements

  • Bachelor’s degree or equivalent work experience in Computer Science, Data Science, Statistics, Engineering, or a related quantitative field.
  • 5+ years in a data analytics, analytics engineering, or data modeling role with demonstrated ownership of analytical data products in a production environment.
  • Proven experience designing and building dbt models, including mart architecture, testing, documentation, and version-controlled development workflows.
  • Strong Python proficiency for data analysis and modeling (e.g., pandas, numpy, statsmodels, or equivalent).
  • Advanced SQL skills for complex analytical queries, data exploration, and data quality validation.
  • Hands-on experience designing and analyzing controlled experiments or A/B tests, including statistical significance testing, power analysis, and practical results interpretation.
  • Demonstrated ability to translate business requirements from non-technical operational stakeholders into well-scoped analytical solutions.
  • Technical rigor with operational empathy: the ability to deeply understand quality teams’ day-to-day challenges and translate them into well-designed, purposeful analytical solutions — not over-engineered abstractions.
  • Strong analytical and statistical reasoning, including applied experience with experimental design, performance attribution, and hypothesis testing in messy, real-world operational data.
  • Exceptional communication: able to translate complex data models and analytical findings into plain-language insights for quality managers, senior leadership, and clients across a diverse range of technical literacy levels.
  • Self-directed and proactive: comfortable managing a diverse project backlog with competing priorities, delivering consistently without close supervision, and raising blockers early and clearly.
  • Collaborative and intellectually curious: genuinely interested in understanding quality processes and domain context deeply enough to ask the right questions before building.
  • Growth orientation: excited about building a new function from the ground up, and committed to documenting, scaling, and sharing work in a way that creates lasting organizational value.
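As a concrete illustration of the experimentation skills listed above (significance testing plus power analysis), the sketch below compares QA pass rates between two workflow variants with statsmodels. The counts, pass rates, and thresholds are purely hypothetical, not figures from any Welocalize program:

```python
from statsmodels.stats.proportion import proportions_ztest, proportion_effectsize
from statsmodels.stats.power import NormalIndPower

# Hypothetical QA review outcomes: control workflow vs. revised workflow
passes = [412, 451]   # items passing QA review in each arm
reviewed = [1000, 1000]  # items reviewed per arm

# Two-proportion z-test: is the difference in pass rates statistically significant?
z_stat, p_value = proportions_ztest(passes, reviewed)

# Power analysis: sample size per arm needed to detect this effect
# size with 80% power at alpha = 0.05
effect = proportion_effectsize(passes[0] / reviewed[0], passes[1] / reviewed[1])
n_per_arm = NormalIndPower().solve_power(effect_size=effect, power=0.8, alpha=0.05)
```

A result like this is where the "practical interpretation" requirement matters: a 3.9-point lift that misses the 0.05 threshold at n = 1,000 per arm is a prompt to collect more data or reconsider the effect size worth detecting, not a verdict that the change did nothing.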

Nice To Haves

  • Post-graduate education or equivalent professional experience in analytics, data modeling, or data engineering.
  • Exposure to quality operations, AI training data workflows, annotation platforms, or BPO/localization environments.
  • Familiarity with QA frameworks, sampling methodology, CAPA processes, rubric design, or quality management systems in a data-intensive context.
  • Experience working in an embedded analytics role supporting an operational team, with accountability for both analytical outputs and the underlying data infrastructure.
  • Proficiency with BI tools — Power BI preferred — for delivering analytical outputs to non-technical stakeholders.
  • Familiarity with ELT/pipeline tooling (e.g., Matillion, Fivetran, or equivalent) and how data flows from operational systems into analytics-ready layers.
  • Familiarity with data warehouse environments (e.g., Snowflake, BigQuery, or similar).
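The dbt testing and data-quality-validation skills named in the Requirements boil down to assertions like the following, shown here in pandas rather than dbt YAML. The table and column names are invented for illustration only:

```python
import pandas as pd

# Hypothetical quality-review fact table (names are illustrative)
reviews = pd.DataFrame({
    "review_id": [1, 2, 3, 4],
    "program": ["A", "A", "B", "B"],
    "pass_rate": [0.92, 0.88, 0.95, 0.90],
})

# dbt-style "unique", "not_null", and accepted-range tests expressed in pandas
assert reviews["review_id"].is_unique, "review_id must be unique"
assert reviews["pass_rate"].notna().all(), "pass_rate must not be null"
assert reviews["pass_rate"].between(0, 1).all(), "pass_rate must be a proportion"
```

In a dbt project these same checks would live as schema tests on the mart, so they run in CI on every version-controlled change rather than ad hoc in a notebook.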

Responsibilities

  • Quality Data Modeling & Analytics Infrastructure
  • Quality Measurement Frameworks & Metrics Design
  • Experimental Design & Performance Validation
  • Decision Support & Stakeholder Partnership
  • Roadmap Ownership & Continuous Improvement