Software Development Engineer - Data

Apple
Cupertino, CA

About The Position

The Apple Services Engineering team is one of the most exciting examples of Apple's long-held passion for combining art and technology. These are the people who power the App Store, Apple TV, Apple Music, Apple Podcasts, and Apple Books. And they do it on a substantial scale, meeting Apple's high expectations with dedication to deliver a huge variety of entertainment in over 35 languages to more than 150 countries. These engineers build secure, end-to-end solutions: the custom software used to process all the creative work, the tools that providers use to deliver that media, all the server-side systems, and the APIs for many Apple services. Thanks to Apple's unique integration of hardware, software, and services, engineers here partner to get behind a single unified vision. That vision always includes a deep dedication to strengthening Apple's privacy policy, one of Apple's core values. Although services are a bigger part of Apple's business than ever before, these teams remain small, forward-thinking, and multi-functional, offering greater exposure to the array of opportunities here.

The Commerce Engineering team is looking for an experienced, hardworking, and proactive software engineer to lead several data architecture and technical compliance efforts across the commerce landscape. Join an exciting engineering team that has been leading the digital distribution industry by constantly developing innovative features to grow and expand the iTunes Store, App Store, iBooks Store, and services such as Apple Music, TV+, and Arcade. The position requires someone comfortable with all aspects of the software development lifecycle and experienced with high-performance distributed data systems. If you thrive in a dynamic, multi-functional environment and can adapt to constantly evolving requirements and new technologies, you'll love it here!
You will be an integral part of the team that is in an outstanding position to develop and build the commerce data platform. You will participate in the instrumentation and ingestion of events and collaborate with partners on data pipelines and datasets, ensuring that data is handled in a privacy-safe manner. This platform gives downstream customers a unified data foundation for consistency, quality, and efficiency. You will design and implement large-scale commerce datasets and backend services, which many downstream teams will use for data analysis, metric reporting, data insights, and model training and evaluation. You will implement data storage solutions that scale, and you will enrich and surface data quality and pipeline metrics. In addition, you must be able to investigate and adopt new technology, be comfortable working in a fast-paced environment, and have a "can-do" attitude.

Requirements

  • 2+ years of experience developing ETL jobs for analyzing and processing high-volume data using Apache Spark, Flink, Kafka, and Iceberg.
  • Expert knowledge of one or more object-oriented programming languages (Scala/Java).
  • Proficient at schema design, data modeling concepts and SQL.
  • Excellent problem-solving and analytical skills.
  • Ability to program in scripting languages such as Python.
  • Experience using NoSQL solutions such as Cassandra, Voldemort, or Memcached.
  • Experience with streaming and batch data processing.
  • Experience with GenAI tools to enhance developer efficiency.
  • BS in Computer Science or Software Engineering.

Nice To Haves

  • MS preferred.
  • Experience with workflow management tools such as Airflow.
  • Ability to learn and research new technologies rapidly.
  • Passion for customer privacy and experience with applying data encryption and data security standards.
  • Strong interpersonal skills and experience working on multi-functional projects.
  • Experience developing large-scale backend storage systems.

Responsibilities

  • Lead several data architecture and technical compliance efforts across the commerce landscape.
  • Develop and build the commerce data platform.
  • Participate in the instrumentation and ingestion of events.
  • Collaborate with partners on data pipelines and datasets, ensuring data is handled in a privacy-safe manner.
  • Design and implement large-scale commerce datasets and backend services.
  • Implement data storage solutions that scale, enrich and surface data quality and pipeline metrics.
  • Investigate and adopt new technology.