Data Engineer I

The Walt Disney Company
Santa Monica, CA
Onsite

About The Position

Technology is at the heart of Disney’s past, present, and future. Disney Entertainment and ESPN Product & Technology is a global organization of engineers, product developers, designers, technologists, data scientists, and more – all working to build and advance the technological backbone for Disney’s media business globally. The team marries technology with creativity to build world-class products, enhance storytelling, and drive velocity, innovation, and scalability for our businesses. We are Storytellers and Innovators. Creators and Builders. Entertainers and Engineers. We work with every part of The Walt Disney Company’s media portfolio to advance the technological foundation and consumer media touch points serving millions of people around the world.

The Acquisition Marketing Engineering team owns the ingestion, modeling, and activation of acquisition and lifecycle marketing data across multiple platforms. We build and maintain large-scale data pipelines that ingest vendor data (paid media, mobile attribution, search, social, display, email, and more), land it in cloud storage, and transform it into analytics-ready datasets powering acquisition reporting tools, dashboards, and executive insights. You will be a member of the engineering team driving multiple transformations from end to end.

The Data Engineer I will support the design, development, and maintenance of data pipelines and transformation workflows that power acquisition reporting and marketing analytics. You will work across AWS, Databricks, Unity Catalog, Snowflake, and Airflow to help build reliable and scalable solutions for ingesting and preparing marketing platform data. You will collaborate with senior engineers, analytics partners, and marketing stakeholders to ensure data accuracy, consistency, and timely delivery for downstream dashboards and reporting. This role involves hands-on development, troubleshooting, and contributing to the ongoing modernization of our data ecosystem.

Requirements

  • Strong proficiency in SQL (analytical SQL, complex joins, window functions).
  • Hands-on experience with PySpark and/or Spark SQL in production.
  • Good understanding of data modeling, ETL/ELT design patterns, and distributed data processing.
  • Experience building pipelines in Databricks, including Delta Lake, Unity Catalog, data governance, and Lakehouse patterns.
  • Experience in AWS (S3, IAM, EC2, Glue, Lambda, or related services).
  • Experience with Airflow or similar orchestration tools.
  • Experience building robust ingestion pipelines and working with common data interchange formats (JSON, Parquet, CSV), including semi‑structured data.
  • Experience with Git/GitHub, CI/CD, and modern DevOps practices.
  • Excellent communication skills and ability to work with cross‑functional partners.
  • Bachelor’s degree in Computer Science, Information Systems, Software Engineering, Advanced Mathematics, Statistics, Data Engineering, or a comparable field of study, and/or equivalent work experience.

Responsibilities

  • Assist in building and maintaining ETL/ELT pipelines for acquisition reporting using Databricks, PySpark, SQL, and Unity Catalog under the guidance of senior engineers.
  • Support the migration of existing Snowflake SQL scripts and transformations into Databricks Unity Catalog by updating queries, validating outputs, and helping implement governance best practices.
  • Contribute to developing ingestion processes for marketing vendor data, including data parsing, normalization, and quality validations.
  • Implement and maintain foundational data quality checks, monitoring alerts, and issue triage workflows using Databricks, Snowflake, Airflow, and internal tooling.
  • Partner with the Data Reliability Engineering team to assist with SLA monitoring, simple incident troubleshooting, and logging improvements.
  • Collaborate with analytics and marketing partners to understand data requirements and ensure accuracy of datasets used in dashboards and reporting.
  • Support performance tuning, logging improvements, and general pipeline reliability work.
  • Participate in engineering best practices, including code reviews, documentation, and contributing to shared frameworks and tools.

Benefits

  • A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.