Research Internships at Microsoft provide a dynamic environment for research careers with a network of world-class research labs led by globally recognized scientists and engineers, who pursue innovation in a range of scientific and technical disciplines to help solve complex challenges in diverse fields, including computing, healthcare, economics, and the environment.

Improving the efficiency of Large Language Models (LLMs) is critical to deploying them at scale, and training (coupled with other techniques such as quantization) is a promising tool for improving efficiency. This Research Internship will design training algorithms and apply them to improving the quality/efficiency trade-offs of large language models, with a focus on resource-constrained environments. Possible directions of investigation include:

- designing new algorithms for quantized model fine-tuning
- leveraging training to improve the token efficiency of reasoning models
- proposing and implementing systems optimizations to scale training under resource constraints
Career Level
Intern
Industry
Publishing Industries
Education Level
Ph.D. or professional degree
Number of Employees
5,001-10,000 employees