Data Engineering Lead

QXO
Rutherford, NJ

About The Position

QXO is a publicly traded company founded by Brad Jacobs, aiming to build the market-leading company in the building products distribution industry. Following its first acquisition of Beacon Building Products, QXO is focused on creating a customer-focused, tech-enabled, and innovation-driven business that will scale rapidly through accretive M&A, organic growth, and greenfield expansion. The company's strategy emphasizes delivering exceptional customer experiences, improving operational efficiency, and leveraging data, digital tools, and AI to modernize a historically under-digitized industry.

The Data Engineering Lead will be responsible for designing, building, and operating scalable, cloud-native data platforms and data products on Google Cloud. This role combines hands-on engineering with technical leadership, setting standards and guiding best practices while partnering closely with analytics, product, and business teams to deliver trusted, analytics-ready data at scale. The lead will play a key role in shaping data architecture, improving data quality and reliability, and enabling BI, analytics, and AI use cases across the organization.

Requirements

  • Programming & Processing: Python, SQL, Apache Spark (PySpark)
  • GCP Data Stack: BigQuery, Google Cloud Storage (GCS), Dataflow, Pub/Sub, Cloud Composer (Airflow)
  • Data Modeling & Analytics Engineering: dbt, analytics-ready and semantic data modeling
  • Metadata, Lineage & Quality: OpenMetadata, OpenLineage, data quality testing, data observability
  • DevOps & Platform Practices: Git, CI/CD for data pipelines, Infrastructure-as-Code (Terraform preferred), Docker fundamentals
  • Security & Governance: Data access controls, data contracts, schema enforcement, governance-by-design

Nice To Haves

  • Experience with real-time / streaming architectures
  • Exposure to ML/AI data pipelines and feature engineering
  • API-based and SaaS data integrations
  • Familiarity with BI tools (e.g., Looker)
  • Multi-cloud or hybrid data environments

Responsibilities

  • Design, build, and maintain batch and streaming data pipelines on GCP
  • Lead end-to-end data product engineering, from ingestion through transformation and consumption layers
  • Define and enforce data engineering standards, patterns, and best practices
  • Implement data quality, metadata, lineage, and observability as part of pipelines
  • Own orchestration, CI/CD, and release processes for data workloads
  • Partner with analytics, BI, ML, and business teams to deliver analytics-ready datasets
  • Provide technical leadership and mentorship to data engineers across teams

Benefits

  • Annual performance bonus
  • Long-term incentive (equity/stock)
  • 401(k) with employer match
  • Medical, dental, and vision insurance
  • Company holidays and parental leave
  • Paid Time Off/Paid Sick Leave: Applicants can expect to accrue 15 days of paid time off during their first year (4.62 hours for every 80 hours worked), with increased accruals after five years of service.
  • Paid training and certifications
  • Legal assistance and identity protection
  • Pet insurance
  • Employee assistance program (EAP)