Data Engineer - Power

Gecko Robotics · Boston, MA
Posted 12 days ago · Onsite

About The Position

Gecko Robotics is helping the world’s most important organizations ensure the availability, reliability, and sustainability of critical infrastructure. Gecko's complete and connected solutions combine wall-climbing robots, industry-leading sensors, and an AI-powered data platform to provide customers with a unique window into the current and future health of their physical assets. This enables real-time decision making to increase the efficiency and safety of operations, promote mission readiness, and protect the environment and civilization from the effects of infrastructure failure.

As a Data Engineer, you will build and evolve the data backbone of an AI-first product spanning document intelligence, time-series IoT data, and agentic AI systems. This is a highly hands-on, end-to-end role for someone who thrives in early-stage ambiguity, is comfortable working close to models and customers, and wants to directly shape how data and AI are used in production.

Requirements

  • 7+ years of experience in data engineering, backend engineering, or adjacent roles
  • Strong Python skills, proficient with ML packages and distributed backends
  • Experience building production data pipelines and systems from scratch
  • Comfort working with both structured and unstructured data
  • Experience running systems in production and owning their reliability
  • Ability to work directly with customers or end users to understand requirements
  • Strong problem-solving skills in ambiguous, fast-moving environments

Nice To Haves

  • Experience contributing to ML workflows (feature engineering, training pipelines, evaluation, or inference)
  • Experience with agentic AI systems or similar orchestration frameworks for enterprise-grade reasoning and/or automation
  • Experience with document processing, NLP, or vector search
  • Experience with time-series or IoT data
  • Startup or early-stage product experience
  • Experience making architectural tradeoffs under real-world constraints

Responsibilities

  • Design, implement, and operate data systems across the full lifecycle—from raw ingestion to AI-driven outputs used by customers in the real world.
  • Work directly with customers and internal stakeholders to understand real problems, translate them into technical solutions, and iterate quickly.
  • Build pipelines that support document processing, sensor data, and ML workflows, contribute to feature engineering and model experimentation when needed, and own systems in production.
  • Make pragmatic architectural decisions, improve reliability over time, and help define best practices as the team and product scale.

Benefits

  • Company equity
  • 401(k) matching
  • Gender-neutral parental leave
  • Full medical, dental, and vision insurance
  • Mental health and wellness support
  • Ongoing professional development
  • Family planning assistance
  • Flexible paid time off


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None listed
  • Number of Employees: 251-500 employees
