Apple · Posted 4 months ago
Senior
Cupertino, CA
5,001-10,000 employees

Apple Services Engineering (ASE)’s Solutions Architecture & Platform team is looking for a technically expert Software Engineer to design, develop, enhance, test, and support its platforms. These platforms power secure, reliable processing of metadata and business operations for Apple’s internal services at scale, including Apple’s Digital Content, enabling content availability to the storefront, reporting, and the development of internal applications and tooling. A key focus of this role is developing software for a stream processing platform within a data mesh architecture, in collaboration with engineers on the team and engineering teams across ASE. The scale and scope are complex and require someone who is passionate about solving difficult problems, conducts technical due diligence, and keeps the big picture in view while solving them.

  • Design and build critical platforms, services, and tools that enable engineers across Apple to build secure and scalable services.
  • Collaborate with engineering teams across ASE to understand the needs of stakeholders.
  • Align on goals, design, and deliver high-quality software that meets Apple's standards and scale.
  • 10+ years of experience in Software Engineering.
  • Strong coding experience in Java and Python.
  • Extensive experience building distributed, stateful microservice systems using RPC and event-driven methodologies.
  • Familiarity with the Domain-Driven Design (DDD) approach.
  • Extensive experience building analytics systems using event-driven methodologies.
  • Deep experience in API design, service-oriented architecture (SOA), large-scale distributed systems, and asynchronous patterns with data guarantees.
  • Experience developing fault-tolerant systems in multi-datacenter (multi-DC) environments.
  • Strong knowledge of open-source stream and batch processing platforms, such as Spark, Flink, and Kafka, and of data formats such as Avro and Protobuf.
  • Experience working with Data Lakehouse technologies (e.g. Apache Iceberg).
  • Experience with large dataset storage solutions (HDFS, S3).
  • Experience with service communication protocols (REST, gRPC).
  • Excellent written and oral communication skills.
  • Strong knowledge of SQL and NoSQL data stores.
  • Experience using Splunk and OpenTelemetry.
  • Experience with cloud computing platforms (particularly AWS and Kubernetes).
  • Knowledge of security (AuthZ/AuthN, mTLS, HTTPS).
  • Knowledge of CDNs.