Senior or Staff MLE - Droid Perception (Onboard)

Zipline
South San Francisco, CA
Onsite

About The Position

Zipline is the world’s largest and most experienced drone delivery service, on a mission to serve all humans equally by ensuring access to food, medicine, and essential goods anytime, anywhere. The company designs, builds, and operates the world’s largest autonomous logistics system, delivering critical supplies quickly and reliably. Zipline operates on four continents, makes a delivery every 30 seconds, and has completed millions of deliveries, including blood, vaccines, medical supplies, food, and retail products, with over 140 million commercial autonomous miles safely flown. Its customers include healthcare systems, governments, retailers, restaurants, and global businesses, and its system strengthens supply chains, reduces congestion, and gives people time back.

Zipline seeks practical problem solvers motivated by building systems with direct, meaningful impact and by scaling the future of logistics: people who reason from first principles, enjoy facing adversity, and can achieve the impossible at record-breaking speed. As the company expands into increasingly complex, safety-critical environments, the systems behind its autonomy stack must be robust, adaptable, and deeply integrated, especially at the intersection of perception and deployment.

This role is for senior and staff perception engineers joining the Droid team, which is responsible for the autonomy powering Zipline’s backyard delivery experience. The team owns the full stack of onboard, offboard, and cloud-side perception systems enabling precise and reliable delivery and package pickup in complex customer backyards. The role involves building real-time 3D perception models that capture geometry, scene semantics, and delivery and pickup preferences.

Engineers will develop across the entire perception stack, from optimizing onboard TensorRT engines to building a data flywheel that surfaces interesting samples from the long tail of customer deliveries. The position requires close collaboration with the planner team to ensure the right system is built. This is a production-focused role, not a research role: it demands shipping production-grade systems fast and finding clever ways to apply state-of-the-art techniques to tangible, high-impact problems.

Requirements

  • 5+ years of experience building and deploying deep learning-based perception systems, particularly in 3D geometry, semantic understanding, or mapping from cameras.
  • Strong understanding of classical computer vision (e.g., camera calibration, epipolar geometry, structure-from-motion, SGBM stereo) and the ability to blend it with modern ML approaches.
  • Expertise and depth in robotics fundamentals: you should be able to reason about reference frames, matrix math, SE(3) manifolds, and probabilistic sensor fusion.
  • Hands-on experience training, iterating on, and optimizing CNN and transformer architectures on target hardware: think NVIDIA Jetson-sized compute.
  • An engineering mindset focused on outcomes over experimentation: you know how to prioritize what's good enough to ship now and what needs to be architected for scale later.
  • Familiarity with building training, data annotation, and evaluation pipelines, not just models.
  • Comfort working across systems: jumping into data pipelines, training infrastructure, or debugging distributed training issues as needed.
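As a rough illustration of the reference-frame reasoning the robotics-fundamentals bullet describes, here is a minimal numpy sketch of applying a rigid-body transform in SE(3) to move points between frames. The function name, rotation, and translation are hypothetical examples, not Zipline code:

```python
import numpy as np

def se3_transform(R, t, points):
    """Apply a rigid-body transform (R, t) in SE(3) to an Nx3 array of points."""
    return points @ R.T + t

# Hypothetical pose: a 90-degree rotation about z plus a translation,
# e.g. expressing a camera-frame point in a vehicle body frame.
theta = np.pi / 2
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
t = np.array([1.0, 0.0, 0.5])

p_cam = np.array([[1.0, 0.0, 0.0]])
p_body = se3_transform(R, t, p_cam)
# Rotating (1,0,0) by 90 deg about z gives (0,1,0); adding t gives (1,1,0.5).
```

Composing such transforms (and keeping track of which frame each quantity lives in) is the everyday currency of multi-camera perception work.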

Nice To Haves

  • Experience deploying models in real-world, high-stakes robotics or autonomy applications is a strong plus: a robot will move based on the outputs of your perception system.

Responsibilities

  • Implement, train and evaluate real-time 3D perception models that work with two or more cameras across one or more timesteps
  • Run these models onboard a resource-constrained computer, finding ways to optimize and reduce compute and memory footprints
  • Build visualization, introspection and eval tooling to deeply understand model performance both on test datasets as well as “in the wild”
  • Help design and implement data selection pipelines that identify the most valuable data from the field, then help annotation teams label those samples faster via prelabeling or pseudo-ground-truthing.
  • Work closely with the droid planner team, building a strong interface between the two subsystems and tracking the right metrics to ensure we’re always hill-climbing towards a better overall system
  • Stay up to date with research in the field, drive experimentation, and help keep Zipline’s modeling stack in lockstep with powerful new paradigms in real-time compute-constrained 3D perception
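As a toy illustration of the multi-camera geometry behind the first responsibility, the sketch below projects one 3D point into two pinhole cameras separated by a stereo baseline; the difference in horizontal image coordinates is the disparity that stereo methods such as SGBM estimate. The intrinsics and poses are made-up example values, not anything from Zipline's system:

```python
import numpy as np

def project(K, R, t, point_world):
    """Project a 3D world point into a pinhole camera with extrinsics (R, t)."""
    p_cam = R @ point_world + t   # world frame -> camera frame
    uv = K @ p_cam                # apply intrinsics
    return uv[:2] / uv[2]         # perspective divide -> pixel coordinates

# Shared hypothetical intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([
    [500.0,   0.0, 320.0],
    [  0.0, 500.0, 240.0],
    [  0.0,   0.0,   1.0],
])

# Camera 0 at the world origin; camera 1 offset by a 0.2 m baseline along x.
R0, t0 = np.eye(3), np.zeros(3)
R1, t1 = np.eye(3), np.array([-0.2, 0.0, 0.0])

point = np.array([0.0, 0.0, 2.0])   # a point 2 m in front of camera 0
uv0 = project(K, R0, t0, point)
uv1 = project(K, R1, t1, point)
disparity = uv0[0] - uv1[0]
# disparity = focal * baseline / depth = 500 * 0.2 / 2 = 50 px
```

Checking model outputs against closed-form geometry like this is one simple form of the introspection tooling the responsibilities list calls for.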

Benefits

  • Equity compensation
  • Overtime pay
  • Discretionary annual or performance bonuses
  • Sales incentives
  • Medical insurance
  • Dental insurance
  • Vision insurance
  • Paid time off

What This Job Offers

Job Type

Full-time

Career Level

Senior

Number of Employees

501-1,000 employees
