Staff Data Systems Engineer

Lightmatter | Mountain View, CA
$196,000 - $217,000 | Hybrid

About The Position

Lightmatter is at the forefront of AI data center infrastructure, developing the world's first 3D-stacked photonics engine, Passage™, to enable high-speed connections for advanced AI and HPC workloads. The company recently secured $400 million in Series D funding at a $4.4 billion valuation and is expanding its teams to accelerate data center photonics development.

This role is for a Staff Data Systems Engineer who will bridge hardware domain expertise with modern data infrastructure. Beyond building data pipelines, this individual will collaborate directly with foundries, OSATs, test engineers, and validation teams to understand what the data means, define how it flows, and construct the platform for AI-driven decisions across numerous hardware programs.

This is a unique data engineering position requiring a background in semiconductor manufacturing, photonics, hardware test/validation, or a related hardware discipline, coupled with a strong interest in data systems, process optimization, and automation. The ideal candidate understands that the core challenge of a data platform lies in identifying and leveraging the most critical data.

Requirements

  • Bachelor’s Degree in Photonics, Data Engineering, Electrical Engineering, or a related field
  • 8+ years of professional experience in data engineering, process engineering, test engineering, or a related technical role in semiconductor, photonics, or hardware manufacturing
  • Proficiency in Python and SQL
  • Experience with Snowflake, dbt, Dagster/Airflow, or similar modern data stack components
  • Experience working with data for hardware test, manufacturing, or characterization systems
  • Demonstrated ability to work cross-functionally with technical teams and external partners (vendors, foundries, or OSATs) to define data requirements and influence practices
  • Strong analytical thinking — you can look at a dataset and understand what it means in the physical world, not just how to transform it

Nice To Haves

  • Master’s or PhD in Physics, Electrical Engineering, Photonics, or a related field with 4+ years of experience
  • Background in silicon photonics, photonic integrated circuits (PICs), or optical systems
  • Experience with High Volume Manufacturing (HVM) ramps or new product introduction (NPI) data workflows
  • Familiarity with semiconductor test data formats (STDF, WAT, parametric data)
  • Experience building data quality frameworks or monitoring systems
  • Experience with AI/ML tools applied to manufacturing or engineering data

Responsibilities

  • Engage directly with foundries, OSATs, and internal engineering teams to align on data formats, delivery mechanisms, and quality standards for new programs
  • Own the data onboarding lifecycle for programs ramping to High Volume Manufacturing (HVM)
  • Identify gaps in data coverage across the pipeline: design, foundry, test/validation, photonics characterization, packaging/assembly, and system-level validation
  • Influence external and internal teams to adopt scalable, platform-compatible data practices — this is hands-on stakeholder work, not email coordination
  • Design, build, and maintain data pipelines that ingest, transform, and deliver hardware data from wafer-level test through system validation
  • Build and maintain data models in dbt on Snowflake that serve downstream analytics, reporting, and AI/ML workflows
  • Develop and operate orchestration workflows (Dagster) for reliable, observable data delivery
  • Implement data quality checks, monitoring, and alerting to ensure platform reliability across production programs
  • Build performance metrics pipelines that translate raw hardware data into actionable insights for engineering and program teams
  • Own stakeholder reporting and deliver training for teams onboarding to platform tools (Snowflake, Sigma, custom dashboards)
  • Support AI/ML workflows by ensuring the data foundation is clean, structured, and reliable
  • Contribute to platform infrastructure: CI/CD, RBAC, cost optimization, self-service tooling
  • Participate in sprint ceremonies, contribute to architectural decisions, and help shape the team's technical roadmap
  • Mentor junior team members on both data engineering practices and hardware domain context

Benefits

  • Comprehensive Health Care Plan (Medical, Dental & Vision)
  • Retirement Savings Matching Program
  • Life Insurance (Basic, Voluntary & AD&D)
  • Generous Time Off (Vacation, Sick & Public Holidays)
  • Paid Family Leave
  • Short Term & Long Term Disability
  • Training & Development
  • Commuter Benefits
  • Flexible, hybrid workplace model
  • Equity grants (applicable to full-time employees)