Origis Energy • posted 3 months ago
Full-time • Senior
Hybrid • Austin, TX
251-500 employees
Heavy and Civil Engineering Construction

Join the Origis Energy Team! Origis Energy is accelerating the transition to a carbon-free future by Reimagining Zero℠. As one of America's leading renewable energy and decarbonization solution platforms, the company continues to expand and reimagine its contribution to the world's net-zero goals. Origis Energy puts customers first to deploy a wide range of sustainable solutions for grid power generation, performance optimization, and long-term operation of solar and energy storage plants across the U.S. Founded in 2008, Origis Energy is headquartered in Miami, FL. The Origis team, regarded as one of the leading developers in the U.S., is committed to living five core values in all interactions both with each other and external stakeholders: Solve for Tomorrow, Rise Together, Perform at Peak, Inspire & Grow, and Be Resilient.

Responsibilities:
  • Develop, optimize, and maintain scalable data pipelines and ETL processes in Databricks.
  • Integrate data from diverse sources (plant SCADA systems, IoT devices, financial/ERP systems, market data, etc.).
  • Write efficient, maintainable code in Python for data transformation, automation, and analytics.
  • Design and maintain Delta Lake, SQL-based transformations, and structured streaming workflows.
  • Implement data quality checks, monitoring, and governance to ensure accuracy and reliability.
  • Collaborate with data analysts, business stakeholders, and operations teams to deliver insights.
  • Support Power BI and other visualization platforms by preparing curated datasets.
  • Apply JavaScript to extend visualization capabilities or enhance web-based reporting tools.
  • Ensure solutions are secure, scalable, and aligned with renewable energy industry compliance standards.
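The data quality checks mentioned above might look something like the following minimal Python sketch. This is purely illustrative, not Origis code: the `Reading` record, field names, and capacity threshold are hypothetical stand-ins for what a plant SCADA feed could carry.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One hypothetical SCADA power reading from a plant."""
    plant_id: str
    power_mw: float

def validate_reading(r: Reading, max_capacity_mw: float = 500.0) -> list[str]:
    """Return a list of data quality issues found in a single reading.

    An empty list means the reading passed all checks.
    """
    issues = []
    if not r.plant_id:
        issues.append("missing plant_id")
    if r.power_mw < 0:
        issues.append("negative power")
    if r.power_mw > max_capacity_mw:
        issues.append("power exceeds plant capacity")
    return issues

readings = [Reading("ORX-01", 120.5), Reading("", -3.0)]
# Collect issues per row index; only failing rows are kept.
bad = {i: validate_reading(r) for i, r in enumerate(readings)
       if validate_reading(r)}
print(bad)  # flags only the second reading
```

In a Databricks pipeline the same rule set would typically run as column expressions over a DataFrame rather than row by row, with failing rows quarantined before the curated Delta tables are written.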
Qualifications:
  • Bachelor's or Master's in Computer Science, Data Engineering, or related field.
  • 3+ years of experience as a Data Engineer (preferably in energy, utilities, or industrial environments).
  • Strong hands-on experience with Databricks (PySpark and Delta Lake; Delta Sharing and MLflow a plus).
  • Proficiency in Python for data engineering and automation tasks.
  • Solid SQL skills for complex queries and data modeling.
  • Familiarity with cloud platforms (AWS, Azure, or GCP), especially in a data context.
  • Experience with CI/CD pipelines, Git, Terraform for Databricks, and DevOps best practices.
  • Familiarity with dbt, Airflow, or other orchestration tools.
  • Understanding of data privacy, security, and compliance principles (e.g., row-level security, encryption).
  • Exposure to machine learning workflows and MLflow.
  • Exposure to JavaScript or related frameworks (React, D3.js) is a plus.
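The SQL skills listed above can be sketched with a small, self-contained example. The table, plant IDs, and values below are invented for illustration, and SQLite is used only for portability; in the Databricks role this would be Spark SQL over Delta tables.

```python
import sqlite3

# In-memory database standing in for a curated plant-readings table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE readings (plant_id TEXT, power_mw REAL);
INSERT INTO readings VALUES
    ('ORX-01', 100.0),
    ('ORX-01', 140.0),
    ('ORX-02', 80.0);
""")

# Aggregate average output per plant, a typical curated-dataset transformation.
rows = conn.execute("""
    SELECT plant_id, AVG(power_mw) AS avg_mw
    FROM readings
    GROUP BY plant_id
    ORDER BY plant_id
""").fetchall()
print(rows)  # [('ORX-01', 120.0), ('ORX-02', 80.0)]
```

Datasets shaped like this aggregate are what would be handed to Power BI or other visualization tools downstream.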
Benefits:
  • Employer-paid health insurance
  • Paid time off
  • 401(k) plan with employer matching contributions