Lead Validation Engineer

AIM
Seattle, WA

About The Position

AIM builds autonomy for the real world - robots that move mountains. Our systems integrate software, electronics, mechanical systems, perception, and mission-critical infrastructure into rugged, safety-critical machines operating on construction and mining sites globally. Our machines operate in dynamic, unpredictable environments where performance, reliability, and safety must hold under real-world conditions - not just in simulation. Delivering autonomy at scale requires not only building systems, but proving they work consistently, safely, and reliably in the field. Validation is how we make that real.

About You

You are a systems-oriented engineer who cares deeply about whether things actually work - not just whether they were designed to work. You:

  • Think in terms of system behavior, not isolated components
  • Build mechanisms that scale, not manual processes
  • Use data and real-world evidence to drive decisions
  • Have strong judgment in ambiguous situations with incomplete information
  • Hold a high bar for reliability, performance, and safety

You are equally comfortable:

  • Designing validation strategies and frameworks
  • Building infrastructure and tooling
  • Debugging failures across hardware, software, and AI systems
  • Working in the field to understand real-world conditions

You take ownership of outcomes - ensuring validation directly improves system performance, safety, and reliability.
About Us

Together we’re building autonomous systems that must perform under real-world conditions - and we validate them by:

  • Ensuring end-to-end system behavior across electrical, mechanical, software, and AI subsystems
  • Validating performance under harsh environments: dust, vibration, temperature extremes, and dynamic terrain
  • Ensuring reliability across long-duration operation on real jobsites
  • Testing complex interactions across autonomy, controls, perception, and hardware systems
  • Capturing real-world edge cases and converting them into repeatable validation scenarios

We move fast, test rigorously, and build validation systems that scale with the fleet.

Why This Role Exists

Autonomous systems fail at system boundaries - in the interactions between perception, planning, controls, hardware, and real-world environments. While engineering teams validate their own components, the Lead Validation Engineer ensures that the integrated system actually works in reality.

This role exists to:

  • Close the gap between engineering intent and real-world behavior
  • Build the validation system that scales with the company
  • Provide an independent signal of system readiness

This is not a QA role. This role builds the system that makes quality measurable, enforceable, and scalable across AIM.

What You Will Own

As the Lead Validation Engineer, you will own the system that ensures AIM’s autonomous machines work reliably, safely, and predictably in the real world. You are accountable not just for validation activities, but for making validation scalable, measurable, and deeply integrated into how we build and deploy systems.

Requirements

  • Bachelor’s degree in Mechanical, Electrical, Software, Systems, Robotics, or related field (or equivalent experience)
  • Experience in system-level validation, test engineering, or reliability engineering
  • Experience validating complex systems integrating hardware, software, and autonomy
  • Strong experience designing validation strategies and frameworks
  • Hands-on experience debugging system-level failures

Nice To Haves

  • Experience in robotics, autonomy, or safety-critical systems
  • Experience building validation infrastructure, simulation, or replay systems
  • Experience with functional safety validation
  • Experience scaling validation from prototype to production
  • Experience with real-world field deployments

Responsibilities

  • Own how validation is performed across AIM.
  • Define the end-to-end validation strategy across simulation, software-in-the-loop (SIL), hardware-in-the-loop (HIL), proving ground (PG), and field deployments
  • Establish the test pyramid, shifting validation earlier in development
  • Define validation coverage across system behaviors, environmental conditions, and the Operational Design Domain (ODD)
  • Ensure validation scales through automation, replay systems, and data-driven testing
  • Build systems that enable engineers to validate their own work.
  • Develop automated test frameworks, simulation and replay systems, data capture and labeling pipelines, and validation dashboards and reporting systems
  • Integrate validation into CI/CD and development workflows
  • Enable self-service validation across AI, software, and hardware teams
  • Ensure the integrated system behaves correctly across domains.
  • Validate interactions across perception, planning, controls, hardware systems, and operator interfaces
  • Identify emergent failure modes not visible at the component level
  • Design validation scenarios that reflect real-world complexity
  • Ensure validation reflects actual operating conditions.
  • Define validation strategies for PG and field deployments
  • Build systems to capture real-world data, reproduce field failures, and convert issues into repeatable tests
  • Ensure field learnings feed back into validation systems
  • Define and enforce system readiness.
  • Define validation gates for feature releases, system releases, and customer deployments
  • Provide validation input into ORRs and release decisions
  • Ensure systems meet performance, reliability, and safety thresholds
  • Act as an independent signal of readiness, not a bottleneck.
  • Make validation measurable.
  • Define metrics including coverage across ODD conditions, failure and escape rates, MTBI (Mean Time Between Interventions), and validation effectiveness
  • Build dashboards and reporting systems
  • Identify and close validation gaps
  • Ensure failures improve the system.
  • Partner on root cause analysis (CoE)
  • Identify validation gaps from incidents and near misses
  • Feed learnings back into validation frameworks and test coverage
  • Drive validation across engineering teams.
  • Define validation expectations across AI, software, and hardware
  • Ensure teams own validation of their components
  • Drive alignment on validation practices and standards
  • Establish validation as a core engineering discipline.
  • Define validation standards and best practices
  • Establish validation reviews and processes
  • Train teams on validation methodologies
  • Ensure validation is treated as an engineering output
  • Serve as the technical authority on validation.
  • Provide independent input on system readiness
  • Gate releases when validation is insufficient
  • Escalate validation risks to leadership
  • Challenge assumptions where system behavior is not proven