Principal Software Engineer, Data

QXO
Seattle, WA
$175,000 - $400,000

About The Position

We’re looking for bold, entrepreneurial talent ready to help build something extraordinary and reshape the future of building products distribution. QXO is a publicly traded company founded by Brad Jacobs with the goal of building the market-leading company in the building products distribution industry. On April 30, 2025, QXO completed its first acquisition: Beacon Building Products, a leading distributor in the sector. We are building a customer-focused, tech-enabled, and innovation-driven business that will scale rapidly through accretive M&A, organic growth, and greenfield expansion. Our strategy is rooted in delivering exceptional customer experiences, improving operational efficiency, and leveraging data, digital tools, and AI to modernize a historically under-digitized industry.

What You'll Do

We are seeking a highly experienced Principal Software Engineer, Data to design and implement a modern data engineering stack that enables scalable, efficient, and high-performance data processing. You will create and optimize scalable data structures for both structured and unstructured data derived from our acquired businesses. This role is critical in supporting machine learning and AI applications, ensuring a robust, well-architected, and optimized data infrastructure that empowers data science and AI teams to develop high-performing models.

As a technical leader, the Principal Data Engineer will architect and build large-scale data systems that efficiently process massive datasets for AI-driven applications, driving the design, development, and implementation of next-generation data solutions that enable seamless AI model training, inference, and real-time data processing at scale.

Requirements

  • Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
  • 7+ years of data engineering experience building data infrastructure, data warehouses, pipelines, and analytics solutions at scale.
  • Proven track record delivering large data solutions (data warehouse implementations, data models, architectures, data flows) in difficult or ambiguous problem areas.
  • Deep proficiency in SQL and at least one programming language (e.g., Python or R) for data processing and orchestration.
  • Strong understanding of distributed systems (MapReduce, MPP architectures, NoSQL databases).
  • Demonstrated ability to design logical and physical data models with expertise in data warehouse optimization techniques (partitioning, distribution, indexing).
  • Experience with data persistence technologies including modern data warehouses and integration patterns (ETL, streaming, federation).
  • Track record of taking ownership of data architecture and driving simplification.
  • Strong ability to communicate technical ideas, build consensus, and influence technical strategy.

Nice To Haves

  • Bachelor's degree in Computer Science, Computer Engineering, or a related technical field.
  • Experience with GCP data services (BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub) and infrastructure-as-code.
  • Experience with modern data warehouse platforms (BigQuery, Snowflake, Redshift), data warehouse design patterns, and data modeling/ETL tools (e.g., dbt).
  • Technical leadership experience influencing 1-2 teams through collaborative development or driving best practices.
  • Experience with data lakehouse architectures, data governance, lineage tracking, and quality frameworks.
  • Proven mentorship track record with the ability to develop technical talent.
  • Contributions to the technical community through patents, publications, or open-source work.

Responsibilities

  • Take ownership of team data architecture, including data warehouse design, providing system-wide design guidance and ensuring data is auditable, available, and accessible.
  • Design and deploy large-scale data solutions including data warehouse implementations in ambiguous problem areas with significant impact on data quality, availability, or business value.
  • Lead logical and physical data model design for data warehouse and data lakes, optimizing query performance, storage efficiency, and data integrity.
  • Anticipate data access patterns and proactively remove bottlenecks, ensuring smooth data flow from operational sources to the data warehouse and analytical endpoints.
  • Drive data engineering best practices, including data discovery, naming conventions, data quality frameworks, operational excellence, and security standards.
  • Make architectural trade-offs between short-term technology needs and long-term business needs (build vs. buy, storage strategies, data warehouse vs. data lake technologies).
  • Lead design reviews for team solutions and related software systems, bringing clarity to complexity and fostering shared understanding.
  • Build consensus across discordant views and resolve root causes of endemic problems, including those that unblock innovation of related teams.
  • Mentor and develop engineers, improving their technical skills and understanding of data engineering best practices.
  • Provide technical assessments for promotion candidates and participate in hiring processes.
  • Develop efficient data models and ensure governance, quality, security, and compliance.
  • Continuously optimize infrastructure for cost, scalability, and performance; implement CI/CD and automation for data workflows.
  • Establish data engineering best practices, including testing, deployment, documentation, and metadata management.
  • Build tools and frameworks to streamline data transformation and accelerate AI-driven decision-making; evaluate emerging technologies.
  • Ensure performant systems through smart storage, caching, and streaming design choices.

Benefits

  • Annual performance bonus
  • Long-term incentive (equity/stock)
  • 401(k) with employer match
  • Medical, dental, and vision insurance
  • PTO, company holidays, and parental leave
  • Paid Time Off/Paid Sick Leave: Applicants can expect to accrue 15 days of paid time off during their first year (4.62 hours for every 80 hours worked) and increased accruals after five years of service.
  • Paid training and certifications
  • Legal assistance and identity protection
  • Pet insurance
  • Employee assistance program (EAP)