Field AI • Posted 2 days ago
$70,000 - $300,000/Yr
Full-time • Entry Level
Hybrid • Irvine, CA
11-50 employees

We’re building the estimation and navigation stack that keeps our legged and humanoid robots balanced, aware, and mission-ready, indoors and out, with or without GPS. You’ll design and ship real-time estimators and fusion pipelines that combine IMU and GNSS/GPS/RTK data with legged-robot proprioception (joint encoders, torque/force and foot-contact sensors) and exteroception (cameras, LiDAR, radar/UWB). You’ll take algorithms from log replay to rugged field performance on embedded Linux targets, partnering closely with the controls, perception, and planning teams.

State Estimation for Legged/Humanoid Bases
  • Design and tune EKF/UKF error-state filters for floating-base pose/velocity, COM, IMU biases, and contact states.
  • Fuse IMU, joint encoders, and foot F/T & contact sensors; implement ZUPT/ZARU, slip handling, and kinematic/dynamic constraints.
  • Expose clean interfaces (frames/timestamps/covariances) to whole-body control and footstep planning.
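
As a concrete illustration of the ZUPT-style updates this role involves, here is a minimal sketch, assuming a 1-D velocity state and illustrative noise values (not Field AI's actual implementation):

```python
# Minimal sketch of a zero-velocity update (ZUPT): when a foot is in
# stable contact, the estimator applies a pseudo-measurement v = 0 to
# arrest IMU integration drift. 1-D toy state; noise values illustrative.

def zupt_update(v_est, P, r_zupt=1e-4):
    """Kalman update with pseudo-measurement z = 0 (H = 1)."""
    innovation = 0.0 - v_est          # z - H x
    S = P + r_zupt                    # innovation covariance
    K = P / S                         # Kalman gain
    v_new = v_est + K * innovation
    P_new = (1.0 - K) * P
    return v_new, P_new

# A drifted velocity estimate is pulled back toward zero on contact:
v, P = zupt_update(0.3, 0.05)
```

In a full error-state filter the same step corrects the velocity block of the state and shrinks its covariance; the scalar case just makes the mechanics visible.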

Perception-Aided Localization & Mapping
  • Stand up VIO/LIO pipelines (stereo/RGB-D + LiDAR) for GPS-denied operation, with map-based relocalization and loop closure.
  • Add global aids (GNSS/RTK, UWB beacons, prior maps) and blend filtering with factor-graph smoothing when advantageous.
  • Manage drift/consistency with robust outlier rejection, gating, and integrity monitoring.
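
The gating mentioned above is commonly a Mahalanobis-distance test of the innovation against a chi-square bound; a minimal scalar sketch (3.84 is the 95% chi-square bound for 1 DOF; all values illustrative):

```python
# Sketch of innovation gating for outlier rejection: accept a measurement
# only if its squared Mahalanobis distance is below a chi-square threshold.
# Scalar case for clarity.

def gate(z, z_pred, S, chi2_thresh=3.84):
    """Return True if the innovation passes the consistency gate."""
    nu = z - z_pred               # innovation
    d2 = nu * nu / S              # squared Mahalanobis distance
    return d2 < chi2_thresh

# A grossly inconsistent measurement (5.0) is rejected before the update:
inliers = [z for z in [1.02, 0.97, 5.0] if gate(z, 1.0, 0.01)]
```

In the multivariate case `d2` becomes `nu' * inv(S) * nu` and the threshold comes from the chi-square distribution with the measurement's degrees of freedom.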

Calibration, Timing & Robustness
  • Own time sync (PTP/Chrony/hardware triggers) and multi-sensor calibration (Allan variance for IMUs, camera-IMU/LiDAR-IMU/base extrinsics, encoder offsets).
  • Build health monitoring, FDIR, and graceful-degradation behaviors for harsh terrain and intermittent sensors.
  • Establish KPIs (ATE/RTE, NEES/NIS, availability) and automated regression tests.
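
Of the KPIs above, NEES (normalized estimation error squared) is the standard filter-consistency metric: eps = e' P^-1 e should average near the state dimension. A scalar sketch with made-up sample values:

```python
# Sketch of a NEES consistency KPI. Requires ground truth, so it is a
# sim/log-replay metric rather than an online one. Scalar toy version;
# the sample values below are illustrative only.

def nees(x_est, x_true, P):
    e = x_est - x_true
    return e * e / P              # e' * inv(P) * e in the scalar case

samples = [(1.1, 1.0, 0.01), (2.0, 2.1, 0.01), (0.5, 0.5, 0.01)]
avg_nees = sum(nees(x, t, P) for x, t, P in samples) / len(samples)
```

An average well above the state dimension signals an overconfident filter; well below, an underconfident one. Chi-square bounds give the formal acceptance region.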

Tooling, Simulation & Field Operations
  • Create log-replay pipelines, datasets, and dashboards for rapid iteration and performance tracking.
  • Validate in simulation (Gazebo, Isaac, etc.) and in the field (stairs, rubble, ramps, slippery floors).
  • Optimize and deploy on embedded targets (Jetson/x86), profiling latency, memory, and numerical stability.
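
The core of a log-replay pipeline like the one described above is deterministic, time-ordered playback of recorded sensor streams; a minimal sketch (stream names and payloads are hypothetical):

```python
import heapq

# Sketch of a log-replay core: merge per-sensor message streams into one
# time-ordered stream so an estimator can be re-run deterministically
# offline. Messages are (timestamp, sensor, payload) tuples.

imu = [(0.00, "imu", "a0"), (0.01, "imu", "a1"), (0.02, "imu", "a2")]
lidar = [(0.005, "lidar", "scan0"), (0.015, "lidar", "scan1")]

replay = list(heapq.merge(imu, lidar))      # globally sorted by timestamp
order = [sensor for _, sensor, _ in replay]
```

Feeding messages in this merged order reproduces the on-robot arrival sequence, which is what makes regressions bit-for-bit comparable across runs.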

Requirements

  • Strong fundamentals in estimation & sensor fusion (EKF/UKF, error-state formulations, observability/consistency, covariance tuning).
  • Hands-on with IMUs (strapdown mechanization, bias/scale, coning/sculling) and GNSS/GPS/RTK (loosely vs. tightly coupled INS).
  • Experience with legged-robot proprioception: joint encoders, foot contact/pressure, torque/force sensors; using kinematic/dynamic constraints in estimators.
  • Proficiency in modern C++ (14/17/20) on Linux; Python for tooling, analysis, and log processing.
  • Comfort with SO(3)/SE(3), Lie-group math, and non-linear optimization.
  • Integration experience with at least two of: cameras (VIO), LiDAR (LIO/scan-matching), UWB, magnetometer/barometer, radar.
  • Familiarity with ROS 1/ROS 2, CMake/Bazel, Docker, CI/CD, and reproducible experiments.
  • Proven track record shipping research-to-production algorithms on real robots with field test cycles.
  • BS/MS/PhD in Robotics/EE/CS/AE or equivalent practical experience.

Nice to Have

  • Factor-graph SLAM/VIO (GTSAM/iSAM2) and non-linear solvers (Ceres/g2o); hybrid filtering + smoothing in production.
  • Whole-body/legged tooling, momentum/COM filters, terrain estimation, and contact-rich datasets.
  • Robustness techniques: adaptive noise models, M-estimators/gating, data association, map-based relocalization.
  • Experience with GPS-denied navigation at scale (warehouses, construction, urban canyons).
  • Real-time/performance chops (RT-PREEMPT, lock-free pipelines, deterministic logging, on-robot telemetry).
  • Embedded/GPU acceleration (NVIDIA/CUDA) for perception-aided estimation.
  • Designing calibration & end-of-line test procedures for production.
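
For the Lie-group math listed in the requirements, the workhorse operation is the SO(3) exponential map (Rodrigues' formula), which takes a rotation vector to a rotation matrix. A pure-Python sketch for illustration; production code would typically use a library such as Sophus or manif:

```python
import math

# Sketch of the SO(3) exponential map: rotation vector w (axis * angle)
# -> 3x3 rotation matrix via R = I + sin(t) K + (1 - cos(t)) K^2,
# where K is the unit-axis skew-symmetric matrix and t = |w|.

def so3_exp(w):
    theta = math.sqrt(sum(c * c for c in w))
    if theta < 1e-12:
        return [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # identity near zero
    kx, ky, kz = (c / theta for c in w)
    K = [[0, -kz, ky], [kz, 0, -kx], [-ky, kx, 0]]  # skew(axis)
    s, c1 = math.sin(theta), 1.0 - math.cos(theta)
    R = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            K2_ij = sum(K[i][k] * K[k][j] for k in range(3))
            R[i][j] = (1.0 if i == j else 0.0) + s * K[i][j] + c1 * K2_ij
    return R

# A 90-degree rotation about z maps the x-axis onto the y-axis:
R = so3_exp([0.0, 0.0, math.pi / 2])
```

Error-state filters live in exactly this tangent space: small orientation errors are rotation vectors, and `so3_exp` maps the correction back onto the manifold.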