Senior Data Infrastructure Engineer

Cobot | Santa Clara, CA
$180,000 - $215,000

About The Position

Cobot is seeking a Senior Data Infrastructure Engineer to design data systems that power real-time insights for next-generation robotics products. This role is ideal for engineers who thrive in fast-paced environments and enjoy working at the intersection of streaming systems, analytics platforms, and mission-critical operations. You will collaborate across robotics, AI, and product teams to architect and scale the backbone that powers Vista (our AI insights layer) and ScoutMap (our 3D mapping and environment intelligence system). Your work will ensure our customers can ask "what's happening now?" and always get fast, reliable answers.

Requirements

  • 5+ years of professional experience in data engineering or data infrastructure roles.
  • Strong proficiency in Python and SQL, with the ability to write production-quality, scalable, and well-tested code.
  • Proven experience designing and operating ingestion pipelines and staging layers (streaming and batch) that support downstream transformations.
  • Experience deploying and managing cloud data infrastructure in AWS using infrastructure-as-code (e.g., Terraform) and container tooling (e.g., Docker, Kubernetes).
  • Hands-on experience with cloud-based data platforms, storage systems, and infrastructure.
  • Familiarity with data quality practices, testing frameworks, and CI/CD for data pipelines.
  • Highly motivated teammate with excellent oral and written communication skills.
  • Enjoys working in a fast-paced, collaborative, and dynamic start-up environment as part of a small team.
  • Willingness to travel occasionally for on-site support or testing, as needed.
  • Must have and maintain US work authorization.

Nice To Haves

  • Proven experience as the technical lead or primary owner of a data pipeline or platform project.
  • Experience with Databricks (Delta Live Tables, SQL Warehouse) and familiarity with dbt or similar tools to support analytics engineers.
  • Strong understanding of multi-tenant architectures, with a track record of balancing cost-efficiency, performance, and reliability at scale.
  • Background in streaming systems (Kafka, Flink, Kinesis, or Spark Structured Streaming).
  • Familiarity with data quality and observability tools (e.g., Great Expectations, Monte Carlo).
  • Exposure to IoT/robotics telemetry or 3D/spatial data processing (e.g., point clouds, LiDAR, time-series).
  • Experience working in a product-facing data role, collaborating directly with product, engineering, and AI/ML teams to deliver data systems that enable new customer-facing features.
  • Demonstrated ability to translate product requirements into scalable data solutions with measurable business impact.

Responsibilities

  • Own the full ingestion path from edge to cloud, ensuring robot telemetry, sensor data, and warehouse events are reliably captured, transported, and made available for downstream systems.
  • Design, build, and operate scalable pipelines and foundational data layers (streaming and batch) that deliver low-latency, reliable data for analytics, AI/ML, and product features.
  • Implement observability, monitoring, and CI/CD practices to ensure pipeline quality and keep data flows robust, maintainable, and trustworthy.
  • Scale and optimize multi-tenant infrastructure, balancing performance, reliability, and cost-efficiency as Cobot’s customer base grows.
  • Collaborate directly with robotics, AI/ML, and product teams to translate product requirements into resilient data systems that unlock features in Vista, Portal, and ScoutMap.
  • Establish and enforce best practices for data engineering, reliability, and security at Cobot.

Benefits

  • Base salary range of $180,000 - $215,000 plus equity and comprehensive benefits.