Nuro leverages a variety of bench-top systems to evaluate and regression-test different aspects of the software and hardware integration layer. This performance simulation platform includes systems for creating simulated latency profiles on robot-native compute platforms, camera ISP tuning, and more. To manage these systems efficiently and effectively, we've built and integrated infrastructure that supports automated autonomy benchmarking workflows and data post-processing.

You will be responsible for the development and integration of this hybrid cloud performance benchmarking cluster, a cornerstone of testing for all autonomy feature development. Engineers use these systems to answer questions like:

- How will my new ML model affect contention on the GPU?
- How does this change affect our sensor-to-actuation end-to-end (E2E) latency?
- How does this impact onboard logging rate as more data flows from Perception to Behavior?

Our team is growing, and we're looking for an engineer to help drive development and the technical roadmap.
Job Type: Full-time
Career Level: Mid Level
Education Level: Not listed
Number of Employees: 501-1,000