About The Position

As a Computer Vision & Autonomy Engineer, you will join the team responsible for designing, developing, and implementing high-speed perception and autonomy stacks capable of identifying and tracking highly dynamic objects. You will solve the unique challenges of high-dynamics sensing, where relative velocities are extreme and the margin for error is zero.

Requirements

  • Master’s or PhD in Robotics, Computer Science, or Aerospace Engineering with a focus on Computer Vision or Autonomous Systems.
  • Expert knowledge of object tracking (KCF, SORT, DeepSORT) and the geometry of moving camera platforms.
  • Proficiency in C++20 and CUDA for high-throughput image processing, and Python for training ML models.
  • Deep understanding of 3D geometry, Kalman Filtering (EKF/UKF), and the physics of relative motion.

Nice To Haves

  • Experience working with Long-Wave Infrared (LWIR) or Mid-Wave Infrared (MWIR) sensors.
  • Experience deploying models on NVIDIA Jetson Orin or FPGA-based vision processing.
  • Proficiency in NVIDIA Isaac Sim, Unreal Engine 5, or Gazebo to generate synthetic data for rare "corner-case" scenarios.
  • Understanding of how perception latency affects the stability of flight control loops.

Responsibilities

  • Develop robust, real-time Deep Learning and Classical CV algorithms for classification and tracking (e.g., YOLO, Transformer-based architectures) of highly dynamic objects.
  • Implement Visual-Inertial Odometry (VIO) and filtering techniques to estimate target 3D trajectories and "Time-to-Go" under high-G maneuvers.
  • Create "GPS-denied" navigation solutions and anti-jamming vision pipelines that maintain autonomy when external signals are compromised.
  • Design "Vision-Based Pursuit" laws and Proportional Navigation (PN) enhancements that translate visual target states into actionable steering commands.
  • Optimize algorithms for ultra-low latency execution on low-power devices, ensuring the "sensor-to-actuator" delay is minimized.
  • Profile and eliminate "long-tail" latency spikes in the autonomy stack to ensure a deterministic sensor-to-actuator response time.
© 2026 Teal Labs, Inc