Senior Software Engineer - Data Pipeline

Torc Robotics, Ann Arbor, MI

About The Position

At Torc, we have always believed that autonomous vehicle technology will transform how we travel, move freight, and do business. A leader in autonomous driving since 2007, Torc has spent over a decade commercializing our solutions with experienced partners. Now a part of the Daimler family, we are focused solely on developing software for automated trucks to transform how the world moves freight. Join us and catapult your career with the company that helped pioneer autonomous technology, and the first AV software company with the vision to partner directly with a truck manufacturer.

Meet The Team

Torc’s Data Production team is a core enabler of our AV3.0 strategy. The team builds and operates the high‑scale data pipelines that transform raw multimodal sensor logs into structured, training‑ready datasets for machine learning, simulation, autonomy development, analytics, and system validation. We collaborate across vehicle integration, data collection, data systems, perception, ML, and release engineering to operationalize the full data loop that powers Torc’s autonomous‑driving stack.

Requirements

  • Bachelor's or Master's degree in a STEM-related field with 5+ years of working experience in cloud technologies and data operations.
  • Experience building or maintaining converters, decoders, or transformation pipelines for sensor‑rich data (e.g., lidar point clouds, camera streams, radar detections).
  • Understanding of multimodal data synchronization, timestamp alignment, and multi‑sensor calibration workflows.
  • Experience with distributed compute frameworks (Ray, Spark, Beam) and cloud‑based platforms like Anyscale and Databricks for large‑scale data‑pipeline execution.
  • Experience with high‑performance computing techniques, including vectorized data processing (NumPy), multithreaded or parallel execution, and GPU‑accelerated compute for optimizing large‑scale sensor‑data workloads.
  • Proficiency in Python, SQL, and shell scripting.
  • Experience with major cloud providers such as AWS, Google Cloud Platform (GCP), or Azure.
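Several of the requirements above center on multimodal timestamp alignment and vectorized data processing. As a rough illustration of what that work can look like, here is a minimal sketch of nearest-neighbor timestamp alignment (e.g., matching camera frames to lidar scans) vectorized with NumPy's `searchsorted`; the function name, sensor arrays, and tolerance are invented for the example, not taken from the posting:

```python
import numpy as np

def align_nearest(ref_ts, target_ts, tol):
    """For each reference timestamp, return the index of the nearest
    target timestamp within `tol` seconds, or -1 if none qualifies.
    Both arrays must be sorted ascending."""
    idx = np.searchsorted(target_ts, ref_ts)       # insertion points
    idx = np.clip(idx, 1, len(target_ts) - 1)      # keep neighbors in range
    left, right = target_ts[idx - 1], target_ts[idx]
    # Pick whichever neighbor is closer to each reference timestamp.
    nearest = np.where(ref_ts - left <= right - ref_ts, idx - 1, idx)
    ok = np.abs(target_ts[nearest] - ref_ts) <= tol
    return np.where(ok, nearest, -1)

cam = np.array([0.00, 0.10, 0.20, 0.30])    # camera frame times (s)
lidar = np.array([0.01, 0.11, 0.26])        # lidar scan times (s)
print(align_nearest(cam, lidar, tol=0.02))  # [ 0  1 -1 -1]
```

Production pipelines would layer calibration offsets and per-sensor clock drift on top of a matching step like this, but the vectorized core stays the same.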

Nice To Haves

  • Working experience with design patterns and framework development for ML and operational data pipelines in the cloud.
  • Familiarity with 3D labeling and CV annotation workflows.
  • Experience optimizing I/O‑heavy workloads, including columnar formats (Parquet, Arrow).
  • Knowledge of orchestration tools (Airflow, Argo, Prefect).
  • Hands‑on experience designing CI/CD automation for data services, including GitHub Actions, Databricks pipelines, and cloud‑native deployment workflows.
  • Background in Agile Engineering Practices.

Responsibilities

  • Design and develop high‑performance data converters for multi‑sensor autonomous‑driving data (camera, lidar, radar), ensuring accurate time alignment and robust handling of raw sensor logs.
  • Design, build, and optimize large‑scale ingestion and transformation pipelines (ETL/ELT) capable of processing petabyte‑scale autonomous‑driving sensor data, and automate them for reliable, production‑grade deployment.
  • Work with data formats such as ROS bags, MCAP, and custom binary encodings; establish standards for schema evolution and metadata integrity.
  • Implement automated data validation, quality checks, and lineage tracking to ensure reliability of production datasets.
  • Collaborate closely with ML, annotation, simulation, and perception teams to ensure cross‑team ownership of data products and deliver datasets that are consistent, semantically correct, and ready for downstream consumption.
  • Proactively assess current capabilities to identify areas for improvement, proposing solutions that align with core strategy and operations.
  • Operate with broad autonomy, leading complex technical work and driving alignment across team boundaries.
  • Own key data‑pipeline and converter solutions end‑to‑end, setting direction and building consensus.
  • Provide project leadership and mentor less‑experienced engineers to ensure high‑quality execution.
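The automated data validation and quality-check responsibility above can be sketched as a tiny, dependency-free rule engine over record metadata; the check names and record fields below are invented for illustration:

```python
def validate_records(records, checks):
    """Run named quality checks over records; return (index, check_name)
    pairs for every failure so bad records can be quarantined or reported."""
    failures = []
    for i, rec in enumerate(records):
        for name, check in checks.items():
            if not check(rec):
                failures.append((i, name))
    return failures

# Hypothetical checks for per-frame lidar metadata.
CHECKS = {
    "has_timestamp": lambda r: isinstance(r.get("ts"), float),
    "points_nonempty": lambda r: r.get("num_points", 0) > 0,
}

records = [
    {"ts": 1.0, "num_points": 12000},   # passes all checks
    {"ts": None, "num_points": 0},      # fails both checks
]
print(validate_records(records, CHECKS))
# [(1, 'has_timestamp'), (1, 'points_nonempty')]
```

Real pipelines typically push results like these into lineage and monitoring systems rather than printing them, but the pattern of declarative, named checks is the same.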

Benefits

  • A competitive compensation package that includes a bonus component and stock options
  • 100% paid medical, dental, and vision premiums for full-time employees
  • 401(k) plan with a 6% employer match
  • Flexibility in schedule and generous paid vacation (available immediately after start date)
  • AD&D and Life Insurance