About The Position

We are looking for a Senior Data Governance and Quality Engineer to play a critical role in maintaining the integrity, accuracy, and reliability of data across our enterprise data ecosystem. The role involves designing and implementing scalable data quality frameworks, driving governance initiatives, and collaborating with cross-functional teams to operationalize high standards of data quality. The ideal candidate has a strong understanding of data governance principles, data lifecycle management, and end-to-end data transformation processes, along with hands-on experience deploying enterprise-level data quality solutions. As a member of our engineering team, you will also coach, mentor, manage, and lead team members within an agile environment.

Requirements

  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • Experience: 5+ years of experience in Data Quality, Data Governance, or Data Engineering roles.
  • Technical Skills:
      ◦ Strong hands-on experience with Databricks, PySpark, Python, and SQL.
      ◦ Proficiency in SQL, Python, and modern data pipeline tools.
      ◦ Familiarity with data governance and quality tools (e.g., Collibra, Informatica, Alation, Great Expectations).
      ◦ Experience working with cloud platforms (AWS, Azure, or GCP).
      ◦ Strong understanding of BI and reporting tools such as Power BI and Tableau.
  • Domain Expertise:
      ◦ In-depth knowledge of data lifecycle management, data lineage, and metadata management.
      ◦ Awareness of current and emerging data governance and data management best practices.
  • Soft Skills:
      ◦ Strong analytical and problem-solving mindset.
      ◦ Excellent communication and stakeholder engagement abilities.
      ◦ Proven capability to operate independently and deliver results in a dynamic environment.

Responsibilities

  • Data Quality Framework Development: Develop, implement, and sustain a comprehensive data quality framework to systematically monitor, validate, and improve data accuracy and consistency across all systems. Build and maintain scalable data quality solutions on Databricks and Apache Spark, primarily using PySpark (a brief illustrative sketch follows this list).
  • Governance & Compliance: Operationalize the enterprise data governance framework, aligning with stakeholder needs related to data quality, access controls, compliance, privacy, and security.
  • Data Monitoring & Auditing: Identify and address data anomalies, inconsistencies, duplicates, and missing values. Conduct periodic audits to ensure ongoing data integrity.
  • Cross-Functional Collaboration: Partner with data engineers, architects, product teams, and analysts to define data quality requirements and ensure alignment with business objectives.
  • Insight Generation: Develop clear and actionable dashboards and reports (e.g., Power BI, Salesforce) to visualize data quality trends, KPIs, and issue resolution progress.
  • Root Cause Analysis: Collaborate with data stewards and product owners to investigate and resolve data quality issues, establishing sustainable remediation processes.
  • Technical Expertise: Apply strong understanding of data models (e.g., star schema, snowflake, data marts, data lakes) to evaluate and improve data structures and flows.
  • Process Ownership: Take ownership of assigned initiatives, break complex challenges into manageable components, and execute plans effectively with minimal supervision.
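
For illustration only, the sketch below shows the kind of rule-based quality check the first responsibility describes, implemented in PySpark. The table name, columns, and thresholds are hypothetical, not part of this role's actual codebase.

```python
# Minimal, illustrative PySpark data quality check (hypothetical table and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical source table; on Databricks this would typically be a Delta table.
orders = spark.read.table("sales.orders")

total = orders.count()

# Completeness: share of rows missing customer_id (hypothetical column).
missing = orders.filter(F.col("customer_id").isNull()).count()

# Uniqueness: number of order_id values that appear more than once (hypothetical key).
duplicates = (
    orders.groupBy("order_id")
    .count()
    .filter(F.col("count") > 1)
    .count()
)

metrics = {
    "row_count": total,
    "missing_customer_id_pct": missing / total if total else 0.0,
    "duplicate_order_ids": duplicates,
}
print(metrics)

# A full framework would persist these metrics and alert when a threshold is
# breached, e.g. missing_customer_id_pct > 0.01.
```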

Benefits

  • Unlimited paid time off – recharge when you need it
  • Work from anywhere – flexibility to fit your life
  • Comprehensive health coverage – multiple plan options to choose from
  • Equity for every employee – share in our success
  • Growth-focused environment – your development matters here
  • Home office setup allowance – one-time support to get you started
  • Monthly cell phone allowance – stay connected with ease