Senior Perception Engineer

Chef Robotics
San Francisco, CA
Onsite

About The Position

Chef Robotics is building autonomous robots that work alongside humans in commercial food preparation environments. Perception is at the heart of what makes these robots reliable.

As a Perception Engineer, you will own the full stack of how our robots see and understand the world: integrating cutting-edge camera hardware, training production-grade deep learning models, and ensuring those models perform accurately and efficiently in real time on the factory floor. You will work on technically rich problems in applied robotics, including dense instance segmentation of deformable food items, real-time inference under tight latency constraints, sensor fusion, and robust tracking in cluttered, dynamic environments. You will not just train models; you will design the pipelines that gather and curate data, define the architectures that balance accuracy and speed, and own the deployment and field troubleshooting of what you build.

This is a high-ownership role on a small team. We work onsite five days a week and move with startup urgency. You will be expected to go deep technically while staying pragmatic about what ships.

Requirements

  • BS, MS, or PhD in Computer Science, Robotics, Electrical Engineering, or a closely related field.
  • 5+ years of combined research and industry experience in computer vision and machine learning, with a track record of shipping perception systems to production.
  • Deep expertise in at least two of: instance/semantic segmentation, object detection, 3D perception, or multi-object tracking.
  • Strong Python skills; experience building production-quality, maintainable code — not just research prototypes.
  • Hands-on experience with deep learning frameworks (PyTorch strongly preferred) and the full training pipeline from data to deployed model.
  • Experience working with RGBD sensors, depth cameras, and point cloud data.
  • Proven ability to build and optimize models for low-latency, real-time inference.
  • Familiarity with ROS or similar robotics middleware.

Nice To Haves

  • Experience using simulation environments (e.g. Isaac Sim, Gazebo) for synthetic data generation, domain randomization, and sim-to-real transfer of perception models.
  • C++ proficiency for performance-critical modules and embedded deployment.
  • Experience with cloud ML infrastructure (GCP, AWS) and containerization (Docker, Kubernetes).
  • Background in autonomous vehicles, warehouse robotics, or other perception-heavy robotics applications.
  • Contributions to open-source CV/ML projects or publications in top-tier venues (CVPR, ECCV, NeurIPS, etc.).

Responsibilities

  • Design, train, and optimize deep learning models for detection, segmentation, pose estimation, and classification, with a focus on real-world robustness over benchmark performance.
  • Build low-latency inference pipelines that approach real-time performance; profile and optimize models for deployment on embedded and edge hardware.
  • Develop and improve multi-object tracking algorithms for reliable identification and motion prediction of items across frames.
  • Solve challenging perception problems specific to food robotics: deformable objects, occlusions, varying lighting, and high visual similarity between categories.
  • Own the end-to-end ML lifecycle: data collection strategy, annotation tooling, dataset curation, augmentation pipelines, model training, evaluation, deployment, and field debugging.
  • Develop tooling to monitor model performance in production and drive continuous improvement cycles.
  • Partner closely with robotics, hardware, and software engineers to translate perception capabilities into reliable end-to-end robot behaviors.
  • Help define the perception roadmap and influence technical direction as the team grows.
  • Assist in integrating new cameras and sensors for enhanced robotic vision.

Benefits

  • Medical, dental, and vision insurance
  • Commuter benefits
  • Flexible paid time off (PTO)
  • Catered lunch
  • 401(k) matching
  • Equity