Our team is part of Amazon's Personalization organization, a high-performing group that leverages Amazon's expertise in machine learning, big data, distributed systems, and user experience design to deliver the best shopping experiences for our customers. We run global experiments, and our work has revolutionized e-commerce with features such as "Keep shopping for", "Customers who bought this item also bought", and "Frequently bought together", among others. We are building the next generation of personalized shopping experiences at Amazon through a deep understanding of our customers' intent and our product catalog. We aim to create an experience akin to that of a talented personal shopping assistant: a partner that is knowledgeable, understands your preferences, and helps you find the right solution for your needs. We hope you will join us!

Key job responsibilities

As an Applied Scientist on the team, you will work on ways to help customers find the right products on their shopping journey. You will hone your skills in areas such as multimodal LLM post-training and cross-modal vision-language reasoning while building scalable, agentic, industry-grade systems.

To be highly successful in this role, the following background is preferred (or expected to be ramped up quickly):
- A strong computer vision foundation
- Familiarity with multimodal encoders, with hands-on experience training multimodal models
- End-to-end ML pipeline experience, spanning data curation, model training, and production deployment
- Experience with LLMs, particularly using LLM-as-a-judge for synthetic data generation
- Experience with online experimentation, including experiment setup and post-analysis
- Expertise in multimodal domain generalization (a strong plus)
Job Type: Full-time
Career Level: Mid Level