About The Position

We are seeking a self-motivated intern to prototype AI-driven sense–plan–act architectures for autonomous robotic systems in manufacturing plants. You will participate in the development of a camera- and LiDAR-based wheel-drive robot and in integrating software with physical and simulation platforms. You will work cross-functionally with autonomy experts on system-level validation. The role also includes evaluating AI-driven methods for localization and mapping, perception, motion planning, scenario simulation, and data engineering, with hands-on experimentation, algorithm development, and multi-modal sensor integration.

Requirements

  • Currently enrolled in a bachelor's program in Robotics, Computer Science, Electrical/Mechanical Engineering, or a related technical field.
  • Proficiency in C++ or Python.
  • Willingness to adhere to continuous development and deployment practices in robotic software development.

Nice To Haves

  • Machine learning knowledge and practical experience.
  • Proficiency with deep learning frameworks and toolchains such as PyTorch and TensorFlow.
  • Familiarity with open-source repositories such as DETR, BEVFormer, BEVFusion, SAMv2, Ceres Solver, GTSAM, ORB-SLAM, and VINS-Mono.
  • Experience working with cloud-based data collection and data pipeline systems.
  • Experience with AV/ADAS integration or industrial automation.
  • Graduating between December 2026 and June 2027.
  • Experience or coursework in one or more of the following areas:
      • Camera- and LiDAR-based localization algorithms
      • Statistical estimation theory (e.g., pose graph or factor graph techniques)
      • Place recognition and loop-closure detection
      • Perception tasks such as object detection or semantic representation
      • Motion planning algorithms and platforms (e.g., Nav2)
      • Simulation tools (e.g., IsaacSim, IsaacLab)
      • Dataset handling or annotation
      • Software optimization for resource-constrained systems
      • ROS2 or other robotics middleware

Responsibilities

  • Support the design and implementation of high-precision localization methods using camera, LiDAR, wheel-encoder, and inertial sensors.
  • Assist in developing a scalable, real-time localization module optimized for autonomous robotic systems.
  • Help create engineering specifications and test procedures to ensure system compliance.
  • Evaluate and benchmark system performance.
  • Review the state of the art in camera- and LiDAR-based algorithms.
  • Troubleshoot issues by applying knowledge of probabilistic estimation, sensor fusion, and real-time system implementation.

Benefits

  • Paid US GM Holidays
  • GM Family First Vehicle Discount Program
  • Results-based potential for growth within GM
  • Intern events to network with company leaders and peers