Senior Data Engineer

Arcadia
$131,250 - $235,000 | Remote

About The Position

Arcadia is the global utility data and energy solutions platform. With our leading data platform, AI-powered analytics, industry expertise, and expansive partner network, we deliver solutions for every stage of the enterprise energy management lifecycle across carbon, cost, and reliability. Arcadia’s Enterprise Energy Management Solutions are built on a foundational data platform that has been developed for over a decade and scaled across millions of customer facilities.

We transform fragmented data and siloed processes into coordinated, enterprise-wide action with comprehensive solutions, including:

  • Utility Bill Management: Lower utility costs and streamline bill management with automated bill payment, proactive error identification, optimized tariff structures, and budgeting & forecasting.
  • Energy Procurement Advisory: Source clean energy through a comprehensive evaluation of supply options, including traditional retail options as well as onsite and offsite resources, to effectively manage risks, reduce costs, and achieve corporate sustainability goals.
  • Sustainability Reporting: Achieve compliance goals and track carbon emissions with standardized energy data and seamless integration with leading sustainability platforms.

Tackling an enterprise client’s most critical energy challenges requires out-of-the-box thinking and diverse perspectives. We’re building a team of individuals from different backgrounds, industries, and educational experiences. If you share our passion for ushering in the era of clean, cost-effective electrons, we look forward to learning what you would uniquely bring to Arcadia!

We are seeking a Senior Data Engineer to create clean, curated, and organized data sets used across Arcadia. The role is hands-on and collaborative, building data assets that enable analysis and drive daily business decisions. You’ll bring an engineer’s rigor and a seasoned data modeler’s expertise to our thorniest data modeling problems, helping to lead our efforts to organize, rationalize, and produce enriched data products from our data. Your expertise will also be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company’s growth.

Our core data stack makes heavy use of Snowflake, dbt, and Fivetran for ingestion, all orchestrated within our broader AWS-based ecosystem. You’ll work primarily with SQL in dbt to build our curated data layer (we call it the Unified Data Model) and to produce analytics pipelines for our derived data products, which power our Solutions business. We use dbt Core, orchestrated with Argo or Prefect.
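
To give a flavor of that setup, here is a minimal sketch of a Prefect flow that runs dbt Core builds for a curated layer. The selector names ("unified_data_model", "solutions") are hypothetical placeholders; an actual deployment would use the team's real project structure, selectors, and scheduling.

    # Minimal sketch: orchestrating dbt Core builds with Prefect 2.x.
    # Selector names below are hypothetical placeholders, not the real project layout.
    import subprocess

    from prefect import flow, task


    @task(retries=2, retry_delay_seconds=60)
    def dbt_build(select: str) -> None:
        """Run `dbt build` for a given selector; a non-zero exit fails the task."""
        subprocess.run(["dbt", "build", "--select", select], check=True)


    @flow(name="curated-layer-refresh")
    def refresh_curated_layer() -> None:
        # Build the curated layer first, then the derived analytics models that depend on it.
        dbt_build("unified_data_model")
        dbt_build("solutions")


    if __name__ == "__main__":
        refresh_curated_layer()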

Requirements

  • 4+ years as a Senior Data Engineer or Software Engineer building production data infrastructure. dbt experience is highly desirable
  • 6+ years, cumulatively, in the data space (data engineering, data science, analytics, or similar)
  • Expert-level understanding of conceptual data modeling and data mart design
  • Proficiency in SQL and strong Python skills, especially in the context of data orchestration.
  • Deep experience building data pipelines and managing databases (Snowflake or similar), along with familiarity with data integration patterns and ELT/ETL processes.
  • Experience with orchestration tools like Prefect, Airflow, or Argo.
  • Proven ability to establish and enforce AI-related engineering best practices (e.g., security, responsible code review, and prompt discipline) within data pipelines and codebase contributions, ensuring data integrity, architectural stability, and the safe use of AI-assisted tools.
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Experience in technical leadership or mentorship
  • Strong communication and collaboration skills
  • The ability to work East Coast business hours to maximize overlap with team members in India
  • Proven ability to solve complex problems in a dynamic and evolving environment

Nice To Haves

  • Graduate degree in math, statistics, engineering, computer science, or related technical field
  • Experience in predictive modeling and statistical analysis
  • Experience with BI platforms
  • Experience working in global, distributed teams
  • Experience in the energy sector

Responsibilities

  • Design, build, and maintain the tooling that the wider Data team (and Arcadia as a whole) uses to interact with our data platform, including CI/CD pipelines for our data lakehouse, and unit/integration/validation testing frameworks for our data pipelines (see the sketch after this list).
  • Optimize and tune data pipelines for improved performance, scalability, and reliability, and proactively monitor pipelines to address any issues or bottlenecks.
  • Collaborate with subject matter experts, engineers, and product managers to identify the most elegant and effective data structures to understand our constantly growing and evolving business
  • Transform, test, deploy, and document data to deliver clean and trustworthy data for analysis to end-users
  • Help bring engineering best practices (reliability, modularity, test coverage, documentation) to our DAG and to our Data team generally
  • Collaborate with data engineers to build robust, tested, scalable and observable ELT pipelines.
  • Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy.
  • Data modeling: model raw data into clean, tested, and reusable datasets to represent our key business data concepts. Define the rules and requirements for the formats and attributes of data
  • Data transformation: build our data lakehouse by transforming raw data into meaningful, useful data elements through joining, filtering, and aggregating source data
  • Data documentation: create and maintain data documentation including data definitions and understandable data descriptions to enable broad-scale understanding of the use of data
  • Employ software engineering best practices to write code and coach analysts and data scientists to do the same
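
As a rough illustration of the validation testing mentioned above, here is a minimal pytest-style sketch with hypothetical table and column names; in practice checks like these would run against Snowflake or be expressed as dbt tests.

    # Minimal sketch of pytest-style data quality checks for a curated dataset.
    # The fetch_rows() helper and the "facility_*" fields are hypothetical; a real
    # test would read from Snowflake or a dbt-built table instead of a fixture.
    def fetch_rows():
        return [
            {"facility_id": "F-001", "usage_kwh": 1250.0},
            {"facility_id": "F-002", "usage_kwh": 980.5},
        ]


    def test_facility_id_is_unique_and_not_null():
        rows = fetch_rows()
        ids = [row["facility_id"] for row in rows]
        assert all(ids), "facility_id must not be null or empty"
        assert len(ids) == len(set(ids)), "facility_id must be unique"


    def test_usage_kwh_is_non_negative():
        rows = fetch_rows()
        assert all(row["usage_kwh"] >= 0 for row in rows), "usage_kwh must be non-negative"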

Benefits

  • "Remote first" culture - work anywhere in the US as long as you have a reliable internet connection
  • Flexible PTO - no accrued hours and no limit on the number of vacation days exempt employees can take each year
  • 12 annual company-wide holidays
  • 10 days sick leave
  • Up to 4 weeks bereavement leave
  • 2 volunteer days off
  • 2 professional development days off
  • 12 weeks paid parental leave for all parents
  • 75-95% employer cost coverage for medical, dental, and vision benefits for employees and dependents