The Motion & Interaction team creates intuitive experiences for our customers through motion sensing. When you simply raise your wrist, shake your head, or move your device to interact, it's the work of engineers and scientists on this team. Our fingerprints can be found across core capabilities and experiences on iPhone, Watch, AirPods, Vision Pro, and other Apple products. We are a multidisciplinary team that operates at the intersection of algorithms, software, hardware, and design. We come from diverse backgrounds in signal processing, machine learning, software engineering, statistics, controls, firmware development, and more. As a member of our dynamic group, you will have a unique opportunity to work cross-functionally to develop products and features that impact the lives of millions of users worldwide on a daily basis.
DESCRIPTION
We are seeking a talented, self-motivated machine learning engineer to build Apple's next-generation features and experiences using multi-modal sensing. In this role, you will ideate, design, and implement models and algorithms while optimizing for power, memory, and performance. You will work on motion sensing-related features, including sensor fusion and interactive technologies.
Job Type
Full-time
Career Level
Mid Level