About The Position

Matter is building the AI-native autonomy stack for physical manufacturing in the United States. We operate as a contract manufacturer, deploying software and autonomy in our own factories, which gives us something most AI companies don’t have: a live production environment as a training ground. Our long-term vision is to become the infrastructure layer for American manufacturing, the way AWS became infrastructure for software.

We are hiring a Research Scientist to lead the development and deployment of Vision-Language-Action (VLA) models for robotic manipulation in live manufacturing work cells. This is not a lab role. You will train models, close the Sim2Real loop, and deploy them on physical robots running production programs. Matter’s Sim2Real pipeline spans NVIDIA Isaac Sim, physics-accurate virtual builds of our modular assembly equipment, and data collected entirely from real factory operations. You will operate at the center of this flywheel, improving models with every production run.

Why Matter

Most VLA research is validated in a lab or on a tabletop. At Matter, your models run on a production factory floor, handling real parts for real customers. The feedback loop is immediate and grounded. The training data is yours because the factory is yours. No one else in this space has that combination at the stage we’re at.

Requirements

  • PhD, graduate degree, or equivalent research depth in robotics, machine learning, or a related field
  • Hands-on experience training and deploying VLA, VLM, or generalist robot policies on physical hardware (not just simulation)
  • Strong foundation in imitation learning, reinforcement learning, and general machine learning methods
  • Proficiency in PyTorch; experience with NVIDIA Isaac Sim, MuJoCo, or similar physics engines
  • Ability to debug the full stack: model architecture, training data quality, sim calibration, sensor noise, and hardware edge cases
  • Comfort operating in a high-velocity, ambiguous environment where you own systems end-to-end

Nice To Haves

  • Experience with MARL or multi-robot coordination
  • Background in manufacturing, industrial automation, or robotic assembly

Responsibilities

  • Develop and fine-tune VLA models for precision assembly tasks, including dexterous manipulation, part handling, and test operations
  • Design and manage the Sim2Real training pipeline: domain randomization, synthetic data generation, physics simulation (NVIDIA Isaac Sim, MuJoCo), and sim-to-physical transfer
  • Build evaluation frameworks to benchmark real-world manipulation performance against manufacturing tolerances and repeatability requirements
  • Collaborate with controls and automation engineers to fuse learned policies with traditional control architectures for production safety
  • Contribute to the Physical AI architecture decisions: model selection, data strategy, training infrastructure, and deployment protocols
  • Publish novel research at top-tier conferences, though shipping production systems is the primary measure of success

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

Ph.D. or professional degree
