About The Position

In this role, you will work across the stack - from system software and frameworks to end-user interfaces - to build Apple Intelligence features that enrich people’s lives. This is a highly cross-functional role, working with design, algorithms, software, services, privacy, security, performance, and sometimes even HW / Silicon teams across Apple to engineer end-to-end solutions. You will need to work quickly and creatively to demonstrate the viability of ideas and technologies while building robust, production-ready systems that millions of users will depend on. Your work will include building user interfaces and system software in Swift, evaluating AI/ML algorithms for on-device integration, collaborating on user experience design, debugging complex cross-device interactions, and contributing to a culture of shipping high-quality production code. In addition to software engineering, you will apply excellent UX intuition to identify and shape experience opportunities, iterate with design, and define technical requirements that drive development.

Requirements

  • BS / MS / PhD in Computer Science or equivalent experience
  • 2+ years in software development
  • Excellent programming skills in any programming language (preferably at least one of Swift, Objective-C, or C++) and a strong understanding of data structures, memory management, and concurrency
  • Skilled at debugging and triaging complex software systems, including learning to work with new technologies
  • Systems thinking, including the ability to break down ambiguous problems and drive clarity on critical details
  • Strong intuition for user experience and the ability to translate customer needs into technical requirements
  • Solid cross-functional collaboration and technical communication skills

Nice To Haves

  • Shipping customer-facing features or products to production at scale
  • Developing for iOS/macOS and/or using Apple system frameworks like SwiftUI and ARKit
  • Building & shipping features using LLMs and/or Machine Learning algorithms, including on-device inference, data-driven validation, requirement definition, and collaboration with algorithm teams
  • Processing sensor data (e.g., image/video, audio, motion), such as using computer vision or signal processing, working with vector transforms, and/or developing AR applications
  • Developing & validating personalization features, including working with sensitive data sources such as conversation transcripts or Health data
  • Developing under strict privacy and security constraints, including techniques like secure data processing and/or privacy-by-design principles
  • Developing system software or frameworks, including API definition and performance optimization (particularly for resource-constrained or real-time systems)