Senior Data Platform Engineer

10x Genomics, Pleasanton, CA

About The Position

Join our Data Platform team at 10x Genomics to architect and implement our strategic Unified Data Platform (UDP). This pivotal role focuses on modernizing our data infrastructure, transitioning to a scalable Event-Driven Architecture (EDA), and building the foundation for next-generation AI/ML and self-service analytics. In this role, you will:

  • Lead the architecture and delivery of the Single Source of Truth (SSOT) for the 10x Intelligent Data Ecosystem.
  • Apply advanced software engineering practices to data systems to ensure scalability and reliability.
  • Provide hands-on senior data engineering leadership in developing scalable, maintainable ETL/ELT solutions and data systems in a cloud-native environment.
  • Drive the systematic reduction of critical technical debt by retiring fragile legacy middleware (e.g., Boomi).
  • Partner with engineering and business teams to enable advanced AI capabilities and democratize data access via Natural Language Querying (NLQ).

Requirements

  • Bachelor’s degree in Computer Science, Information Management, or a related field, or equivalent experience.
  • 5+ years of hands-on experience in software engineering focused on data platform development, distributed systems, or enterprise integrations.
  • Proven experience designing and implementing highly scalable data platforms on major cloud environments (e.g., AWS, GCP, or Azure).
  • Deep proficiency in one or more general-purpose programming languages (e.g., Python, Java, or similar).
  • Strong foundation in computer science fundamentals, including data structures, algorithms, and system design.

Nice To Haves

  • Expertise in message queues and event streaming platforms (e.g., Kafka, RabbitMQ, Pub/Sub) and implementing Event-Driven Architecture.
  • Experience with building data lakes/lakehouses using open formats like Apache Iceberg on cloud storage (e.g., Amazon S3).
  • Expertise in modern ELT development, data modeling for OLAP/data warehousing using tools like dbt, and advanced Snowflake features (e.g., Snowpipe, Streams, Stored Procedures).
  • Familiarity with containerization (Docker, Kubernetes) and Infrastructure-as-Code (IaC) principles.
  • Prior experience in migrating an organization off a traditional iPaaS platform or eliminating legacy middleware.
  • Experience with Generative AI integration for data access (e.g., NLQ, feature stores).

Responsibilities

  • Architect and implement the canonical data layer and Event-Driven Architecture (EDA) using technologies like Apache Iceberg and Kafka to decouple applications and ensure real-time data flow.
  • Design, build, and optimize high-volume, code-first data pipelines (real-time and batch) across a large application landscape (e.g., Salesforce, Oracle, Workday).
  • Establish Amazon S3 as the Single Source of Truth (SSOT) and govern data using principles like the Medallion Architecture (Silver and Gold layers) and schema evolution.
  • Develop, test, and maintain robust, scalable ELT pipelines and data models in Snowflake, leveraging advanced features such as Snowpipe, Streams, and Stored Procedures.
  • Develop the data presentation layer for self-service analytics, including the Natural Language Query (NLQ) interface integrated with Generative AI (e.g., Bedrock).
  • Lead technical efforts to migrate key business domains off legacy middleware and onto the new platform, eliminating the "Integration Bottleneck".
  • Define and enforce data governance, quality, and security standards across the Unified Data Platform.
  • Collaborate with the Architecture Review Board (ARB) to promote modern approaches such as serverless computing and Domain-Driven Design.
  • Take ownership of the full development lifecycle, from prototyping and design through deployment, monitoring, and operational excellence.

Benefits

  • Equity grants
  • Comprehensive health and retirement benefit programs
  • Annual bonus program or sales incentive program