Data Engineer ID56348

AgileEngine, Downey, CA
Hybrid

About The Position

We are looking for a Platform Automation & Data Engineer to design and maintain data ingestion pipelines and automation frameworks that power cloud cost transparency and optimization for Indeed’s Cloud Economics team. You will work across Python, SQL, dbt, Airflow, AWS, and Snowflake to build scalable ELT/ETL solutions and telemetry-driven cost attribution systems. The role serves Product, Engineering, Finance, and Analytics teams with reliable, data-informed decision-making infrastructure.

Requirements

  • 4+ years of experience in a Data Engineering role;
  • Strong hands-on experience with SQL and Python;
  • Experience with data transformation and workflow orchestration tools such as dbt, Airflow, or Dagster;
  • Demonstrated experience designing and maintaining end-to-end ELT/ETL data pipelines;
  • Strong background in data modeling and schema design;
  • Experience with AWS services (Glue, Aurora, Athena) and with Snowflake;
  • Familiarity with CI/CD pipelines and Infrastructure as Code tools such as Terraform;
  • Experience building and consuming REST APIs;
  • Demonstrated ability to design and optimize cost-efficient data solutions using techniques such as query optimization, indexing strategies, and query plan analysis;
  • Strong analytical skills with exceptional attention to detail, including data auditing and validation;
  • Upper-intermediate English level.

Nice To Haves

  • FinOps Practitioner or FinOps Engineer certification;
  • Experience with Scala;
  • Hands-on experience with cloud platforms such as AWS and GCP, and observability tools such as Datadog;
  • Experience working with large-scale datasets and distributed systems;
  • Strong understanding of cloud cost optimization strategies;
  • Experience in highly data-driven, cross-functional environments.

Responsibilities

  • Design, develop, and implement tooling and datasets that enable telemetry-driven cost attribution and performance-informed financial modeling;
  • Build and maintain scalable data pipelines and automation frameworks to support cost transparency and optimization initiatives;
  • Develop systems that surface cost-saving opportunities and support committed-use discount modeling and management;
  • Ensure data accuracy and reliability through end-to-end validation, auditing, and observability practices;
  • Collaborate cross-functionally with Product, Engineering, Finance, and Analytics stakeholders to deliver high-impact data solutions;
  • Improve and maintain the long-term scalability, performance, and reliability of ELT/ETL pipelines;
  • Support and enhance CI/CD processes and infrastructure automation.

Benefits

  • Mentorship
  • TechTalks
  • Personalized growth roadmaps
  • Competitive compensation
  • USD-based pay
  • Education budget
  • Fitness budget
  • Team activity budgets
  • Work on modern solutions with Fortune 500 and top product companies
  • Flexible schedule
  • Remote options
  • Office options