We’re building the estimation and navigation stack that keeps our legged and humanoid robots balanced, aware, and mission-ready, indoors and out, with or without GPS. You’ll design and ship real-time estimators and fusion pipelines that combine IMU and GNSS/GPS/RTK with legged-robot proprioception (joint encoders, torque/force and foot-contact sensors) and exteroception (cameras, LiDAR, radar/UWB). You’ll take algorithms from log replay to rugged field performance on embedded/Linux targets, partnering closely with the controls, perception, and planning teams.
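To give a flavor of the kind of fusion work involved, here is a minimal sketch (not our actual stack, and all names are illustrative): a per-axis linear Kalman filter that propagates on IMU acceleration, corrects with GNSS position fixes, and applies a zero-velocity update (ZUPT) when a stance foot is detected, which is one simple way leg proprioception constrains drift.

```python
# Minimal illustrative sketch: IMU-driven prediction, GNSS position update,
# and a foot-contact zero-velocity update, per axis. Not production code.
import numpy as np

class SimpleFusionKF:
    """State x = [position, velocity]; IMU acceleration drives the prediction."""
    def __init__(self, accel_noise=0.5, gnss_noise=1.0, zupt_noise=0.05):
        self.x = np.zeros(2)           # [p, v]
        self.P = np.eye(2)             # state covariance
        self.q = accel_noise ** 2      # accelerometer noise variance
        self.R_gnss = gnss_noise ** 2  # GNSS position noise variance
        self.R_zupt = zupt_noise ** 2  # zero-velocity pseudo-measurement noise

    def predict(self, accel, dt):
        """Propagate state with measured acceleration over timestep dt."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt ** 2, dt])
        self.x = F @ self.x + B * accel
        Q = np.outer(B, B) * self.q    # process noise mapped from accel noise
        self.P = F @ self.P @ F.T + Q

    def _update(self, z, H, R):
        """Standard scalar-measurement Kalman update."""
        y = z - H @ self.x             # innovation
        S = H @ self.P @ H + R         # innovation covariance (scalar)
        K = self.P @ H / S             # Kalman gain
        self.x = self.x + K * y
        self.P = (np.eye(2) - np.outer(K, H)) @ self.P

    def update_gnss(self, pos):
        """Absolute position fix from GNSS/RTK."""
        self._update(pos, np.array([1.0, 0.0]), self.R_gnss)

    def update_foot_contact(self):
        """Stance foot implies ~zero body velocity along this axis (ZUPT)."""
        self._update(0.0, np.array([0.0, 1.0]), self.R_zupt)

# Example cadence: high-rate IMU prediction, contact-gated ZUPTs,
# and lower-rate GNSS corrections.
kf = SimpleFusionKF()
kf.predict(accel=0.2, dt=0.005)   # e.g. 200 Hz IMU
kf.update_foot_contact()          # stance detected
kf.update_gnss(pos=1.3)           # e.g. 5 Hz RTK fix
```

The real problem is of course nonlinear (orientation, biases, multi-sensor timing), so production estimators in this space are typically error-state EKFs or factor-graph smoothers; the sketch above only illustrates the predict/update structure the role centers on.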