About The Position

Amazon's Industrial Robotics Group is seeking exceptional talent to help develop the next generation of advanced robotics systems that will transform automation at Amazon's scale. We're building robotic systems that combine innovative AI, sophisticated control systems, and advanced mechanical design to create adaptable automation solutions capable of working safely alongside humans in dynamic environments. This is a unique opportunity to shape the future of robotics and automation at unprecedented scale, working with world-class teams pushing the boundaries of robotic manipulation, locomotion, and human-robot interaction through innovative applications of deep learning and large language models.

At the Industrial Robotics Group we leverage advanced robotics, machine learning, and artificial intelligence to solve complex operational challenges at unprecedented scale. Our fleet of robots operates across hundreds of facilities worldwide, working in sophisticated coordination to fulfill our mission of customer excellence.

We are pioneering the development of robotics foundation models that:

  • Enable unprecedented generalization across diverse tasks
  • Deliver industry-ready robustness and reliability
  • Integrate multi-modal learning capabilities (visual, tactile, linguistic)
  • Accelerate skill acquisition through demonstration learning
  • Enhance robotic perception and environmental understanding
  • Streamline development processes through reusable capabilities

The ideal candidate will contribute to research that bridges the gap between theoretical advancement and practical implementation in robotics. You will join a team that is revolutionizing how robots learn, adapt, and interact with their environment, and help build the next generation of intelligent robotics systems that will transform the future of automation and human-robot collaboration.
As an Applied Science Manager in the Foundation Model team, you will build and lead a team that develops and improves machine learning systems that help robots perceive, reason, and act in real-world environments. You will set the technical direction for leveraging state-of-the-art models (open source and internal research), evaluating them on representative tasks, and adapting/optimizing them to meet robustness, safety, and performance needs. You will drive the capability roadmap and the evaluation strategy that defines “what the robot brain can do,” and you will sponsor targeted innovation when gaps remain. You’ll collaborate closely with research, controls, hardware, and product teams, and ensure the team’s outputs can be further customized and deployed by downstream teams on specific robot embodiments.

Requirements

  • 3+ years of experience managing scientists or machine learning engineers
  • PhD (or equivalent) plus 6+ years of applied ML experience.
  • Deep expertise in modern ML for robotics, such as multimodal foundation models (LLM/VLM/VLA), visuomotor policies, video/world-model approaches, imitation learning and/or reinforcement learning
  • Experience setting technical direction and delivering end-to-end results (problem framing → execution → measurable impact), including rigorous evaluation
  • Software engineering judgment; experience building reliable experimentation, training, and evaluation systems (Python plus one of C++/Java)
  • Cross-functional leadership and communication skills in a multi-disciplinary robotics setting
  • Experience transitioning research into deployed systems (reliability, safety/robustness evidence, inference efficiency/latency constraints)
  • Experience with sim↔real/real↔sim workflows and real-robot evaluation at scale
  • Experience scaling training and data pipelines (distributed training, large-scale dataset curation, evaluation automation)

Responsibilities

  • Build and lead a team responsible for state-of-the-art foundation models (visuomotor / VLA / world-model-action policies), and grow capability through hiring, coaching, and bar-raising.
  • Own the technical roadmap and portfolio strategy: proactively track SOTA (open-source + internal research), decide what to adopt, and drive targeted innovation where gaps persist.
  • Establish the capability control plane: define evaluation strategy, benchmarks, scorecards, and regression practices that profile what the robot FMs can do across sim + real and guide investment decisions.
  • Drive embodiment readiness for FMs: ensure models can be adapted/optimized for target embodiments (interfaces, latency/throughput, robustness, safety constraints) and that outputs are consumable by downstream teams for robot-specific finetuning and deployment.
  • Lead the data & training strategy: set standards for data governance/provenance/quality, define data needs for closing key gaps, and ensure efficient training/fine-tuning pipelines and experimentation velocity.
  • Partner across the org: collaborate with research teams (to transition new methods), and with controls/WBC, hardware, and product teams (to align interfaces, constraints, milestones, and integration plans).
  • Communicate and deliver: produce clear technical narratives (roadmaps, design docs, evaluation readouts), manage execution toward milestones, and ensure high-quality handoffs.

Benefits

  • health insurance (medical, dental, vision, prescription; Basic Life & AD&D insurance with optional supplemental life plans; EAP, mental health support, medical advice line; Flexible Spending Accounts; adoption and surrogacy reimbursement coverage)
  • 401(k) matching
  • paid time off
  • parental leave


What This Job Offers

  • Job Type: Full-time
  • Career Level: Manager
  • Education Level: Ph.D. or professional degree
