Applied Scientist - Perception (SLAM/VIO), Fauna

Amazon
New York, NY
Onsite

About The Position

We are seeking an Applied Scientist to develop and optimize Visual Inertial Odometry (VIO) and sensor fusion systems for our intelligent robots. In this role, you will design, implement, and deploy state estimation and tracking algorithms that enable robots to understand their position and motion in real time, even in challenging and dynamic environments. You will own the full pipeline from algorithm development through embedded deployment, ensuring that perception systems run efficiently on resource-constrained robotic hardware. You will also leverage modern machine learning approaches to push the boundaries of classical perception methods, combining learned representations with geometric techniques to achieve robust, real-time performance.

This is a deeply hands-on role. You will work directly with sensors, hardware, and real-world data, while prototyping, testing, and iterating in physical environments. The ideal candidate has strong foundations in VIO and sensor fusion, practical experience optimizing algorithms for embedded platforms, and familiarity with how modern deep learning is transforming perception.

Requirements

  • PhD, or a Master's degree with 3+ years of applied research experience
  • Experience with a programming language such as Python, Java, or C++
  • Hands-on experience developing and deploying Visual Inertial Odometry or visual-inertial SLAM systems
  • Strong understanding of multi-sensor fusion (cameras, IMUs, odometry) and state estimation (EKF, factor graphs); see the minimal filter sketch after this list
  • Experience optimizing perception algorithms for embedded or resource-constrained hardware
  • Demonstrated hands-on experience with real sensor data, calibration, and physical robot platforms
  • Familiarity with modern ML approaches to perception (learned feature extraction, depth prediction, end-to-end odometry)
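To give a concrete sense of the state-estimation fundamentals above, here is a minimal linear Kalman filter in Python, fusing a constant-velocity prediction with noisy position fixes. This is an illustrative sketch only: a production VIO stack estimates full SE(3) pose with an error-state EKF or a factor graph plus IMU preintegration, and every rate and noise value below is a made-up placeholder.

    # Minimal 1D Kalman filter: constant-velocity prediction + position updates.
    # Illustrative only; not a VIO implementation.
    import numpy as np

    class KF1D:
        def __init__(self, dt=0.01):           # 100 Hz prediction (assumed)
            self.F = np.array([[1.0, dt],      # state transition on [pos, vel]
                               [0.0, 1.0]])
            self.H = np.array([[1.0, 0.0]])    # we observe position only
            self.Q = np.diag([1e-4, 1e-3])     # process noise (fabricated)
            self.R = np.array([[1e-2]])        # measurement noise (fabricated)
            self.x = np.zeros(2)               # state estimate [pos, vel]
            self.P = np.eye(2)                 # state covariance

        def predict(self):
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q

        def update(self, z):
            y = z - self.H @ self.x                      # innovation
            S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
            K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(2) - K @ self.H) @ self.P

    kf = KF1D()
    for _ in range(10):        # propagate between measurements
        kf.predict()
    kf.update(np.array([0.12]))  # position fix in meters (fabricated)
    print(kf.x)

The same predict/update structure carries over to the multi-sensor case; what changes is the state representation (pose on a manifold rather than a scalar) and the measurement models per sensor.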

Nice To Haves

  • Experience leading technical initiatives and owning key deliverables
  • Publication record at major robotics or computer vision conferences (e.g., ICRA, IROS, RSS, CVPR, ECCV)
  • Experience with real-time systems programming and performance profiling on ARM/GPU platforms
  • Experience with state estimation on legged robots
  • Experience with stereo vision systems, camera-IMU calibration, time synchronization, and sensor characterization
  • Track record of shipping VIO or SLAM systems to production on physical robots at scale
  • Experience with NVIDIA Jetson, Qualcomm RB5, or similar embedded AI platforms
  • Familiarity with ROS/ROS2 (a camera-IMU time-sync sketch follows this list)
  • Experience integrating learned perception modules (e.g., neural depth, feature matching networks) into geometric estimation pipelines
  • History of technical leadership and cross-functional collaboration
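As a small illustration of the ROS 2 and time-synchronization items above, here is a minimal rclpy node that pairs camera frames with IMU samples using message_filters' approximate-time policy and logs the residual stamp offset. The topic names and the 5 ms slop are assumptions for the sketch, not details of the actual platform.

    # Hedged sketch: camera-IMU stamp alignment in ROS 2 (rclpy).
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image, Imu
    from message_filters import Subscriber, ApproximateTimeSynchronizer

    class CamImuSync(Node):
        def __init__(self):
            super().__init__('cam_imu_sync')
            cam = Subscriber(self, Image, '/camera/image_raw')  # assumed topic
            imu = Subscriber(self, Imu, '/imu/data')            # assumed topic
            # slop: max allowed stamp difference in seconds (assumed 5 ms)
            self.sync = ApproximateTimeSynchronizer([cam, imu],
                                                    queue_size=30, slop=0.005)
            self.sync.registerCallback(self.on_pair)

        def on_pair(self, img: Image, imu: Imu):
            # Residual offset between the matched stamps, in milliseconds.
            t_img = img.header.stamp.sec + img.header.stamp.nanosec * 1e-9
            t_imu = imu.header.stamp.sec + imu.header.stamp.nanosec * 1e-9
            self.get_logger().info(f'camera-IMU offset: {(t_img - t_imu) * 1e3:.2f} ms')

    def main():
        rclpy.init()
        rclpy.spin(CamImuSync())

    if __name__ == '__main__':
        main()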

Responsibilities

  • Design and implement Visual Inertial Odometry algorithms for robust real-time state estimation on robotic platforms like Sprout
  • Develop multi-sensor fusion pipelines integrating cameras, IMUs, and other sensing modalities for accurate pose tracking (a pose-graph sketch follows this list)
  • Optimize perception and tracking algorithms for deployment on embedded hardware (e.g., ARM, GPU-accelerated edge devices) under strict latency and power constraints
  • Apply modern ML-based perception techniques (learned features, depth estimation, neural odometry) to complement and improve classical geometric approaches
  • Build and maintain calibration, evaluation, and benchmarking infrastructure for perception systems
  • Collaborate with hardware, controls, and navigation teams to integrate perception outputs into the robot’s autonomy stack
  • Lead technical projects from research prototyping through production deployment
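For a feel of the factor-graph side of this work, below is a hedged sketch of a small pose-graph optimization in GTSAM's Python API, patterned on GTSAM's own 2D odometry example. GTSAM is one common choice, not necessarily this team's stack; a prior factor anchors the first pose, between-factors chain relative odometry, and all numeric values are fabricated.

    # Hedged sketch: 2D pose-graph with a prior and two odometry factors (GTSAM).
    import gtsam
    import numpy as np

    graph = gtsam.NonlinearFactorGraph()

    prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.3, 0.3, 0.1]))
    odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))

    # Anchor the first pose, then chain relative-motion (between) factors.
    graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0, 0, 0), prior_noise))
    graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(2, 0, 0), odom_noise))
    graph.add(gtsam.BetweenFactorPose2(2, 3, gtsam.Pose2(2, 0, 0), odom_noise))

    # Deliberately-off initial guesses; optimization pulls them into place.
    initial = gtsam.Values()
    initial.insert(1, gtsam.Pose2(0.5, 0.0, 0.2))
    initial.insert(2, gtsam.Pose2(2.3, 0.1, -0.2))
    initial.insert(3, gtsam.Pose2(4.1, 0.1, 0.1))

    result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
    for k in (1, 2, 3):
        print(k, result.atPose2(k))

In a VIO setting the between-factors would come from IMU preintegration and visual constraints over SE(3) poses, but the graph-plus-optimizer structure is the same.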

Benefits

  • Sign-on payments
  • Restricted stock units (RSUs)
  • Health insurance (medical, dental, vision, and prescription coverage; Basic Life & AD&D insurance with optional Supplemental Life plans; EAP; mental health support; Medical Advice Line; Flexible Spending Accounts; adoption and surrogacy reimbursement coverage)
  • 401(k) matching
  • Paid time off
  • Parental leave