About The Position

FieldAI’s Irvine team is where embodied AI meets real robots, real sensors, and real field deployments. Based in the heart of Southern California’s robotics ecosystem, we build risk-aware, reliable, field-ready AI systems that solve the hardest problems in robotics and unlock the full potential of embodied intelligence. If you want your work to ship, get tested on hardware, and improve through real deployments, Irvine is the place. We go beyond typical data-driven approaches and pure transformer-only architectures, combining rigorous engineering with learning systems proven in globally deployed solutions that deliver results today and get better every time our robots run in the field.

We are looking for a Software Engineer, Verification and Validation (V&V), Perception to join our Autonomy V&V team and help ensure the performance, robustness, and reliability of our robotic perception stack in real-world operating conditions. In this role, you will focus on the independent verification and validation of perception software designed and developed by our Perception team, providing an objective assessment of system readiness, performance, and quality. You will work closely with the Perception, Autonomy, Systems, Hardware, and Field Test teams to define validation strategies, execute test campaigns, analyze failures, and communicate results.

This role is ideal for someone who enjoys being the technical bridge between development and release quality, bringing rigor, independence, and strong engineering judgment to the evaluation of perception capabilities built on LiDAR, cameras, radars, GPS, and IMUs.

Requirements

  • Bachelor’s or Master’s degree in Robotics, Electrical Engineering, Computer Engineering, Computer Science, Mechanical Engineering, or a related technical field.
  • 3+ years of experience in verification, validation, systems test, or perception evaluation for robotics, autonomous systems, automotive, or similar domains.
  • Experience working with robotic sensors such as LiDAR, cameras, GPS, and IMUs.
  • Strong understanding of perception system behavior, sensor limitations, and common failure modes.
  • Experience developing test plans, validation procedures, performance metrics, and structured test reports.
  • Experience analyzing logs, datasets, and field results to debug issues and perform root-cause analysis.
  • Strong cross-functional communication skills and the ability to work effectively with development teams while representing an independent V&V function.

Nice To Haves

  • Experience validating perception systems for autonomous vehicles, mobile robots, drones, industrial robots, or defense robotics platforms.
  • Familiarity with perception workflows such as detection, tracking, localization, mapping, or sensor fusion.
  • Experience with simulation, software-in-the-loop, hardware-in-the-loop, and replay-based validation.
  • Experience with sensor calibration, synchronization, time alignment, and sensor health monitoring.
  • Experience building automated regression tools or validation infrastructure.
  • Familiarity with annotated datasets, ground-truth generation, and scenario-based test design.
  • Knowledge of structured verification processes, requirements traceability, and safety-oriented development practices.

Responsibilities

  • Independently validate perception software developed by the Perception team
      ◦ Evaluate perception software from a V&V perspective to ensure it meets system and product requirements.
      ◦ Provide objective verification of functionality, robustness, and release readiness.
      ◦ Work closely with developers while maintaining an independent quality and validation role.
  • Define validation strategies for perception systems
      ◦ Create V&V plans for LiDAR-, camera-, GPS-, and IMU-based perception systems across sensor, module, and system levels.
      ◦ Convert product and engineering requirements into clear, testable validation criteria.
      ◦ Establish performance metrics, quality gates, and release readiness criteria.
  • Design and execute perception test coverage
      ◦ Develop test cases for nominal, edge-case, and degraded sensor conditions.
      ◦ Validate performance across simulation, replay, and real-world test environments.
      ◦ Ensure coverage across environmental, operational, and sensor failure scenarios.
  • Analyze failures using logs and datasets
      ◦ Investigate issues using recorded sensor logs, replay tools, and annotated datasets.
      ◦ Perform structured root-cause analysis on perception and sensor-related failures.
      ◦ Partner with the Perception team to reproduce issues, clarify expected behavior, and verify fixes.
  • Build scalable validation and regression workflows
      ◦ Improve repeatable validation through automation, replay, and simulation-based testing.
      ◦ Support regression coverage across software releases and perception updates.
      ◦ Contribute to clear reporting and measurable quality tracking.
  • Drive cross-functional release confidence
      ◦ Collaborate with Perception, Autonomy, Systems, Hardware, and Field teams on validation priorities and readiness.
      ◦ Communicate risks, findings, and recommendations to technical stakeholders.
      ◦ Support data-driven release decisions with clear, independent validation evidence.

Benefits

  • Our salary range is generous, and we consider each individual’s background and experience when determining final compensation. Base pay may vary based on role scope, job-related knowledge, skills, experience, and the Irvine, California market.