Walt Disney · Posted 3 months ago
$138,900 - $203,900/Yr
Mid Level
San Francisco, CA
5,001-10,000 employees
Motion Picture and Sound Recording Industries

Disney Entertainment and ESPN Product & Technology is a global organization of engineers, product developers, designers, technologists, data scientists, and more, all working to build and advance the technological backbone for Disney's media business globally. The team marries technology with creativity to build world-class products, enhance storytelling, and drive velocity, innovation, and scalability for our businesses. We are Storytellers and Innovators. Creators and Builders. Entertainers and Engineers. We work with every part of The Walt Disney Company's media portfolio to advance the technological foundation and consumer media touch points serving millions of people around the world.

This person will work on the ACP project, the unification and migration of the subscription data model. They will work in a cross-team environment, pairing with the lead data engineer to design and implement the new data pipeline in an on-prem environment while ensuring backward compatibility and zero downtime. They will also engage with the Nova modernization project, helping refactor legacy data pipelines spanning demographic, subscription, watch-behavior, and web-page-interaction data. They will improve data quality with new sources and support integrations with different systems. The role offers exposure to both the Hadoop ecosystem (HDFS, Hive, Spark) and modern cloud-based data technologies (Redshift, Snowflake, etc.) and services (Java Spring Boot), as well as the opportunity to work with both batch and real-time data.
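As a loose illustration of the schema-unification and backward-compatibility work described above, here is a minimal sketch in plain Python. All record shapes and field names are hypothetical, invented for this example; they are not the actual ACP or Nova data model:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical legacy record shape (field names are illustrative only).
@dataclass
class LegacyRecord:
    user_id: str
    plan: str
    start: str                    # legacy name for the start date
    region: Optional[str] = None  # optional in the old model

# Hypothetical unified target schema.
@dataclass
class UnifiedSubscription:
    user_id: str
    plan: str
    start_date: str
    region: str

def migrate(rec: LegacyRecord) -> UnifiedSubscription:
    """Map a legacy record onto the unified schema.

    Missing optional fields get explicit defaults so that downstream
    consumers of the new model never see nulls the old model allowed.
    """
    return UnifiedSubscription(
        user_id=rec.user_id,
        plan=rec.plan,
        start_date=rec.start,
        region=rec.region or "unknown",
    )
```

In the role described, a mapping like this would presumably run inside Spark batch or streaming jobs rather than plain Python, but the core idea is the same: translate each legacy record deterministically so both old and new consumers can be served during the migration window.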

  • Contribute to the Nova system: add new features and continuously optimize existing assets.
  • Contribute to the ACP and Nova modernization projects.
  • Promote and support Agile methodologies such as Scrum, Kanban, and Scrumban by actively participating in regular ceremonies such as stand-ups, retrospectives, and sprint planning.
  • Provide on-call support.
  • Collaborate with your squad, Product Managers, Designers, QA, Operations, and other stakeholders to understand requirements and articulate technical decisions and outcomes.
  • Minimum of 5 years of related work experience.
  • Experience with building batch/real-time data pipelines in Spark.
  • Experience and/or certification with the Hadoop ecosystem (Hive, HDFS, etc.).
  • Experience in Java and/or alternative JVM application development environments (e.g., Scala).
  • Excellent understanding of software development fundamentals.
  • Experience with standard CI/CD procedures and tools such as Jenkins, GitHub Actions, and Spinnaker.
  • Experience working in a modern, agile software team with version control & project management tools (e.g., Git, SVN, Jira, Basecamp).
  • Experience with big data technologies and tools, including Flink, Databricks, and EMR.
  • Knowledge of or certification in AWS services, including ECS (Docker), S3, EC2, Lambda, and CloudWatch.
  • Experience with Terraform or other infrastructure as code tooling.
  • Experience with graph-based data workflow tools such as Apache Airflow or Meson.
  • Experience building scalable, fault-tolerant, high-uptime systems.
  • Experience with writing unit, integration, and functional tests.
  • Proven ability to integrate with service APIs and/or SDKs.
  • Creative and inventive problem-solving.
  • Experience mentoring other developers.
  • A bonus and/or long-term incentive units may be provided as part of the compensation package.
  • Full range of medical, financial, and/or other benefits.