Pony AI
Fremont, CA
1,001-5,000 employees
Autonomous Driving

As part of the Perception team, you will help design and build the sensor data pipeline that powers our self-driving vehicles. Our team is responsible for turning raw sensor signals into reliable, real-time information that enables advanced perception models. You'll work across multiple sensing modalities - cameras, lidars, radars, IMUs, microphones, and more - and help ensure that our autonomous driving system can perceive the world with accuracy and robustness. This role is a great fit for engineers excited about robotics, sensor systems, and building the bridge between hardware and AI models.

  • Work on algorithms, tools, and models that extract critical information from multi-modal sensors in real time.
  • Develop and validate systems that ensure sensor data is accurate, synchronized, and reliable, including calibration, error detection, and health monitoring (a brief stream health-monitoring sketch follows this list).
  • Integrate sensor data into the perception stack and build efficient data flows that power real-time algorithms.
  • Preprocess multi-sensor inputs, for example through time synchronization and ground detection, to improve perception performance (a brief timestamp-matching sketch follows this list).
  • Contribute to the overall perception pipeline, from raw sensor integration to AI-ready features.
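The posting itself contains no code, but as a rough illustration of the time-synchronization work mentioned above, the sketch below pairs each camera frame with the nearest lidar sweep by timestamp. All names (match_nearest_timestamp, tolerance_s) and the sample rates are illustrative assumptions, not part of Pony AI's actual stack.

```python
from bisect import bisect_left
from typing import Optional, Sequence


def match_nearest_timestamp(
    target_ts: float,
    candidate_ts: Sequence[float],
    tolerance_s: float = 0.05,
) -> Optional[int]:
    """Index of the candidate timestamp closest to target_ts, or None if
    no candidate lies within tolerance_s seconds. candidate_ts must be sorted."""
    if not candidate_ts:
        return None
    # Binary search for the insertion point, then compare its two neighbors.
    i = bisect_left(candidate_ts, target_ts)
    best = None
    for j in (i - 1, i):
        if 0 <= j < len(candidate_ts):
            if best is None or abs(candidate_ts[j] - target_ts) < abs(candidate_ts[best] - target_ts):
                best = j
    if best is not None and abs(candidate_ts[best] - target_ts) <= tolerance_s:
        return best
    return None


# Example: pair 30 Hz camera frames with the nearest 10 Hz lidar sweeps.
camera_ts = [0.000, 0.033, 0.066, 0.100]
lidar_ts = [0.005, 0.055, 0.105]
pairs = [(t, match_nearest_timestamp(t, lidar_ts)) for t in camera_ts]
print(pairs)  # [(0.0, 0), (0.033, 1), (0.066, 1), (0.1, 2)]
```

In a production pipeline this association would typically rely on hardware-synchronized clocks and per-sensor latency compensation, but nearest-timestamp matching is a common baseline.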
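Similarly, the health-monitoring bullet could involve something like the minimal stream monitor sketched below, which counts dropped messages from timestamp gaps and flags a sensor as stale after a period of silence. The class name, thresholds, and 10 Hz example are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SensorHealthMonitor:
    """Count dropped messages from timestamp gaps and flag a stream as
    stale when no message has arrived within max_gap_s seconds."""

    expected_period_s: float              # e.g. 0.1 for a 10 Hz sensor
    max_gap_s: float = 0.5                # silence threshold before flagging stale
    last_ts: Optional[float] = field(default=None, init=False)
    dropped: int = field(default=0, init=False)

    def on_message(self, ts: float) -> None:
        if self.last_ts is not None:
            gap = ts - self.last_ts
            # A gap well beyond one nominal period implies missed messages.
            if gap > 1.5 * self.expected_period_s:
                self.dropped += round(gap / self.expected_period_s) - 1
        self.last_ts = ts

    def is_stale(self, now: float) -> bool:
        """`now` must come from the same clock as the message timestamps."""
        return self.last_ts is None or (now - self.last_ts) > self.max_gap_s


# Example: a 10 Hz stream that skips two messages, then goes silent.
mon = SensorHealthMonitor(expected_period_s=0.1)
for ts in (0.0, 0.1, 0.2, 0.5):           # the 0.3 s and 0.4 s messages were lost
    mon.on_message(ts)
print(mon.dropped)            # 2
print(mon.is_stale(now=1.2))  # True: more than 0.5 s of silence
```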