About The Position

We’re looking for an Autonomy Engineer focused on onboard autonomy—the software that runs on the robot/vehicle/embedded computer and makes real-time decisions using onboard sensors and compute. You’ll build and ship reliable autonomy features that operate under tight latency, compute, and safety constraints in the real world.

Requirements

  • Strong software engineering skills in C++ and/or Rust (Python acceptable as a supporting language).
  • Experience shipping software that runs on-device under real-world constraints (embedded Linux, soft real-time systems, performance-sensitive code).
  • Understanding of autonomy fundamentals: planning, state estimation/localization, controls, and how they interface (you don’t need to be an expert in all).
  • Experience with robotics middleware and tooling (commonly ROS/ROS 2, custom pub/sub frameworks, gRPC, DDS, etc.).
  • Proficiency with debugging and performance tools (e.g., gdb/lldb, perf, flamegraphs, profiling GPU workloads, log/trace analysis).
  • Strong testing discipline: unit/integration tests, simulation/HIL concepts, and safe rollout practices for autonomy.

Nice To Haves

  • Experience with behavior trees (e.g., BehaviorTree.CPP), hierarchical state machines, or mission/task planning.
  • Practical experience with local planners (trajectory rollout, MPC, sampling-based methods) and real-time control loops.
  • Sensor fusion experience (EKF/UKF), time sync, calibration, and handling intermittent sensors.
  • Experience with mapping and localization stacks (scan matching, visual-inertial odometry, SLAM, map-based localization).
  • Familiarity with safety standards/processes (e.g., ISO 26262 concepts, FMEA, hazard analysis) depending on domain.
  • Experience deploying autonomy to fleets: OTA updates, versioning, configuration management, and field telemetry.
  • Experience with on-device inference optimization (e.g., quantization, model compression, accelerated runtimes).

Responsibilities

  • Develop, integrate, and deploy onboard autonomy behaviors (e.g., navigation, obstacle avoidance, lane/route following, docking, interaction behaviors).
  • Implement and maintain real-time decision-making components: behavior planning, state machines/behavior trees, local planning, and control interfaces.
  • Build robust sensor-driven autonomy pipelines on-device (camera, lidar, radar, IMU, wheel odometry, GNSS), including synchronization, calibration hooks, and fault handling.
  • Optimize autonomy performance for latency, CPU/GPU usage, memory, and power on embedded compute (e.g., NVIDIA Jetson, x86 edge boxes, custom ECUs).
  • Design and implement safety and fallback strategies: health monitoring, degraded modes, watchdogs, safe-stop, and redundancy-aware logic.
  • Own the autonomy stack’s on-robot integration: bring-up, debugging, profiling, logging, and release validation on real hardware.
  • Improve onboard observability: structured logs, traces, metrics, event recording, and tools to support incident review and rapid iteration.
  • Collaborate with perception, mapping/localization, controls, hardware, and systems teams to define clear interfaces and ship end-to-end features.
  • Participate in field testing and root-cause analysis of autonomy issues seen in real deployments.

Benefits

  • At Nuro, your base pay is one part of your total compensation package.
  • This position is also eligible for an annual performance bonus, equity, and a competitive benefits package.