About The Position

Uplight is creating a new category of energy by developing software that manages energy resources in homes and businesses, including smart thermostats, electric vehicles, solar panels, storage batteries, and heat pumps, as well as customer behavior. This software generates, shifts, or saves energy to balance the grid, making it more efficient and reliable and creating clean energy capacity that can be used instead of burning fossil fuels. Uplight's solutions accelerate the transition to clean energy and save money for energy customers. The company is seeking a Data Operations Engineer to join its analytics engineering team and contribute to these ambitious goals for the business and the planet.

Requirements

  • A minimum of 3 years of professional experience developing in a modern programming language (Python preferred)
  • Solid knowledge of ETL and data integration
  • A commitment to testing and to developing quality software
  • Strong critical thinking skills and a desire to work with ambiguous challenges
  • Experience working in an Agile environment and a strong understanding of the full SDLC
  • Strong troubleshooting skills that span the full stack (front-end clients, APIs, networking, DNS, Linux, containers, databases, distributed systems, etc.)
  • Experience deploying production applications on at least one major cloud provider (AWS, GCP, Azure)
  • Experience writing and maintaining data pipelines and ETLs leveraging Spark or similar tooling
  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related field

Nice To Haves

  • Experience in the utility industry
  • Experience working cross-functionally with design, product, customer success, sales, etc.
  • Deep technical knowledge of Python, AWS/GCP, Docker, and/or PostgreSQL

Responsibilities

  • Work as an Engineer on our analytics engineering team, primarily developing in Python and leveraging a wide range of technologies, notably: AWS and GCP, Docker, Apache Airflow, Apache Spark, and PostgreSQL
  • Take problems from inception to completion: own the building, testing, deployment, and maintenance of the code you work on
  • Tackle complex problems that span a wide range of technical skills, including developing data pipelines to transform and process data between systems, productionizing machine learning pipelines that leverage billions of rows of data, and scaling our software to handle ever-growing customer data
  • Implement monitoring, alerting, and logging systems for data pipelines
  • Automate routine data operations tasks and optimize workflows for scalability and efficiency
  • Work effectively on an Agile team and collaborate with data engineering, analytics, and DevOps teams to support data infrastructure
  • Build robust, well-documented processes to facilitate data triage and associated fixes
  • Participate in on-call rotation and incident response related to data system outages or failures
  • Save the planet

Benefits

  • Ample advancement opportunities
  • Robust learning and development programs
  • Supportive team environment that fosters collaboration and innovation
  • Comprehensive benefits
  • Flexible time off
  • Generous parental leave
  • Wellness stipend
  • Work flexibility
  • Employee Resource Groups