Robotics/Perception Summer Intern

moss · San Francisco, CA
Onsite

About The Position

Join us as a summer intern! We’re a small, fast-moving team of five developing novel perception systems and algorithms to interpret the physical world in challenging, real-world environments. As an intern, you’ll help build and ship our 3D perception and mapping pipeline end to end. You’ll develop sensor fusion and 3D mapping from LiDAR, cameras, and GPS to turn raw data into a real-time understanding of farms. You’ll also help make the system work reliably outdoors, handling motion, lighting, shadows, dust, and occlusions, and optimize it for real-time performance and memory on edge hardware. Projects can range from core perception/mapping algorithms and multimodal modeling to integration, testing, benchmarking, and deployment on robots in the field. We move quickly, solve hard problems, and are excited by the reactions we get from our customers. We’re hiring for both full-time (in-person) roles and intern candidates for co-ops, summer, and/or part-time roles.

Requirements

  • Impressive real-world projects beyond the classroom (robotics, mapping, autonomy, perception, etc.)
  • Hands-on experience with 3D sensor data (LiDAR, radar, depth cameras)
  • Advanced proficiency in C++ or Rust (one of them is your primary language)
  • Proficient in Linux (daily driver), including bash, systemd, and networking fundamentals
  • Experience training ML models on custom datasets (data curation, labeling, training/eval loops)
  • Experience with object detection, semantic segmentation, and/or classical CV / point-cloud methods (e.g., clustering, registration, tracking)

Responsibilities

  • Own the full engineering lifecycle, from research and design to prototype, production, deployment, and iteration on real robots
  • Build and ship our 3D perception + mapping stack, turning multi-sensor data into robust, scalable, production outputs
  • Develop and evolve sensor fusion across LiDAR, cameras, GPS, and environmental signals for 3D detection, analysis, and mapping
  • Make it work in the real world: lighting shifts, harsh shadows, motion blur, dust, and severe occlusions
  • Partner closely with hardware to prototype and integrate new sensors and form factors (thermal, IR, soil sensors, and more)
  • Drive performance and reliability: data/ML infrastructure (labeling, datasets, training, eval) plus real-time optimization on edge hardware (latency, memory, parallelization, CUDA)