2026 Intern, Bixby Edge AI, Language (Summer)

Samsung Research America Internship · Mountain View, CA
$36 - $48 · Onsite

About The Position

Bixby is an intelligent personal assistant available exclusively as a built-in application on Samsung flagship devices and wearables. It uses Natural Language Understanding to perform tasks on these devices via voice or text, including but not limited to making phone calls, sending text messages, setting up meetings, opening apps, setting alarms and timers, getting directions, answering general questions, and providing information about restaurants and other businesses.

This position focuses on designing, prototyping, and advancing cutting-edge AI systems for language intelligence, personalization, and adaptive reasoning. You'll work at the intersection of research and engineering, contributing to both exploratory research and scalable implementations, with a strong emphasis on efficient and privacy-aware on-device AI. You will:

  • Work on next-generation personalized and agentic AI systems with real-world impact
  • Collaborate in an environment that values both research depth and practical engineering excellence

Requirements

  • Currently pursuing a BS/MS in Computer Science or related field
  • Excellent programming skills in Python and Java/Kotlin
  • Strong experience with modern ML frameworks (e.g., PyTorch, TensorFlow, JAX)
  • Proficiency in data preparation, cleaning, and visualization techniques
  • Solid understanding of language models, retrieval-based systems, or knowledge-augmented AI
  • Experience designing and implementing end-to-end AI systems, from modeling to evaluation
  • Passion for writing sophisticated, maintainable, and readable code
  • Excellent communication and collaboration skills

Nice To Haves

  • Android framework development experience is a plus

Responsibilities

  • Design and implement AI models and systems for language understanding and generation
  • Develop and prototype retrieval-based AI systems
  • Contribute to agentic reasoning and planning systems
  • Research and implement self-evolving and adaptive memory mechanisms that support model inference in memory-constrained environments
  • Explore continual and incremental learning approaches that enable models to adapt over time
  • Contribute to on-device intelligence research, including:
      • Efficient model architectures
      • Memory- and computation-aware inference
      • Privacy-preserving and edge-friendly learning techniques
  • Collaborate with cross-functional research and engineering teams to transition ideas into prototypes and product-ready solutions
  • Contribute to internal publications, patents, and external research venues