Origis Energy · Posted 3 months ago
Full-time • Senior
Austin, TX
251-500 employees

The Senior Data Engineer, reporting to the Senior Business Systems Manager, will design, build, and optimize our data infrastructure with a focus on Databricks, Python, and modern data engineering practices. This role supports data ingestion, transformation, and analytics across operational, financial, and production systems, enabling stakeholders to make informed decisions. Knowledge of JavaScript or front-end integration is a plus for enhancing reporting and user-facing tools.

Responsibilities:
  • Develop, optimize, and maintain scalable data pipelines and ETL processes in Databricks.
  • Integrate data from diverse sources (plant SCADA systems, IoT devices, financial/ERP systems, market data, etc.).
  • Write efficient, maintainable code in Python for data transformation, automation, and analytics.
  • Design and maintain Delta Lake, SQL-based transformations, and structured streaming workflows.
  • Implement data quality checks, monitoring, and governance to ensure accuracy and reliability.
  • Collaborate with data analysts, business stakeholders, and operations teams to deliver insights.
  • Support Power BI and other visualization platforms by preparing curated datasets.
  • Apply JavaScript to extend visualization capabilities or enhance web-based reporting tools.
  • Ensure solutions are secure, scalable, and aligned with renewable energy industry compliance standards.
Qualifications:
  • Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field.
  • 3+ years of experience as a Data Engineer (preferably in energy, utilities, or industrial environments).
  • Strong hands-on experience with Databricks (PySpark, Delta Lake; Delta Sharing and MLflow a plus).
  • Proficiency in Python for data engineering and automation tasks.
  • Solid SQL skills for complex queries and data modeling.
  • Familiarity with cloud platforms (AWS, Azure, or GCP), especially in a data context.
  • Experience with CI/CD pipelines, Git, DevOps best practices, and Terraform for Databricks.
  • Familiarity with dbt, Airflow, or other orchestration tools.
  • Understanding of data privacy, security, and compliance principles (e.g., row-level security, encryption).
  • Exposure to machine learning workflows and MLflow.
  • Exposure to JavaScript or related frameworks (React, D3.js) is a plus.
Benefits:
  • Employer-paid health insurance
  • Paid time off
  • 401(k) plan with employer matching contributions
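To give candidates a concrete sense of the data-quality and transformation work described above, here is a minimal sketch in plain Python (no Spark dependency). All names here — `Reading`, `validate`, `transform`, and the field names — are hypothetical illustrations, not part of Origis Energy's actual codebase; in practice this logic would typically live in a Databricks/PySpark pipeline writing to Delta Lake.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Reading:
    """A single SCADA-style power reading from a plant (illustrative schema)."""
    plant_id: str
    timestamp: datetime
    power_kw: float


def validate(reading: Reading) -> list[str]:
    """Return a list of data-quality issues; an empty list means the reading passes."""
    issues = []
    if not reading.plant_id:
        issues.append("missing plant_id")
    if reading.power_kw < 0:
        issues.append("negative power_kw")
    if reading.timestamp > datetime.now(timezone.utc):
        issues.append("timestamp in the future")
    return issues


def transform(readings: list[Reading]) -> list[dict]:
    """Drop invalid readings and convert kW to MW for a curated reporting dataset."""
    return [
        {
            "plant_id": r.plant_id,
            "timestamp": r.timestamp.isoformat(),
            "power_mw": r.power_kw / 1000,
        }
        for r in readings
        if not validate(r)
    ]
```

The same validate-then-curate pattern scales up naturally: in Databricks the checks would run as PySpark column expressions (or Delta Lake constraints) before the curated table is exposed to Power BI.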