Perception Engineer - Cloud Autonomy

Zipline
South San Francisco, CA
$180,000 - $240,000

About The Position

Zipline is on a mission to transform the way goods move. Our aim is to solve the world’s most urgent and complex access challenges by building, manufacturing, and operating the first instant delivery and logistics system that serves all humans equally, wherever they are. From powering Rwanda’s national blood delivery network and Ghana’s COVID-19 vaccine distribution, to providing on-demand home delivery for Walmart, to enabling healthcare providers to bring care directly to U.S. homes, we are transforming the way things move for businesses, governments, and consumers.

The technology is complex but the idea is simple: a teleportation service that delivers what you need, when you need it. Through technology that includes robotics and autonomy, we are decarbonizing delivery, decreasing road congestion, and reducing fossil fuel consumption and air pollution, all while providing equitable access to billions of people and building a more resilient global supply chain. Join Zipline and help us make good on that promise.

Requirements

  • 5+ years of experience building and deploying deep learning-based perception systems, particularly in 3D geometry, semantic understanding, or mapping from remote sensing data.
  • Strong understanding of classical computer vision (e.g., camera calibration, epipolar geometry, structure-from-motion) and the ability to blend it with modern ML approaches.
  • Hands-on experience training, iterating on, and optimizing CNN and transformer architectures in production environments.
  • An engineering mindset focused on outcomes over experimentation—you know how to prioritize what's good enough to ship now and what needs to be architected for scale later.
  • Familiarity with building training, data annotation, and evaluation pipelines—not just models.
  • Comfort working across systems: jumping into data pipelines, training infrastructure, or debugging distributed training issues as needed.
  • Experience deploying models in real-world, high-stakes robotics or autonomy applications is a strong plus.

Responsibilities

  • Own the design and implementation of cloud-side autonomy pipelines that directly support and scale our onboard perception stack.
  • Leverage satellite imagery, aerial surveys, and structured data to build semantic and geometric world models of customer delivery zones.
  • Design and ship tools that predict deliverability, generate high-fidelity priors, and reduce the operational friction of onboarding new customers in new environments.
  • Train and deploy mid- to large-scale models for semantic segmentation, 3D geometry, and learned preference modeling.
  • Design evaluation and validation infrastructure to ensure models behave reliably in the field.
  • Work across engineering to integrate your work into fleet-facing autonomy systems.
  • Lead architectural decisions, drive experimentation, and help the team push the limits of what’s possible with production-grade perception at scale.

Benefits

  • Equity compensation
  • Overtime pay
  • Discretionary annual or performance bonuses
  • Sales incentives
  • Medical, dental and vision insurance
  • Paid time off


What This Job Offers

  • Job Type: Full-time
  • Career Level: Senior
  • Education Level: Bachelor's degree
  • Number of Employees: 1,001-5,000 employees
