About The Position

In the Global Products Group, we are dedicated to excellence in the design and engineering of Lam's etch and deposition products. We drive innovation to ensure our cutting-edge solutions help solve the biggest challenges in the semiconductor industry. We are seeking a Data Scientist Intern to join the Equipment Intelligence team in the Deposition Product Group. Equipment Intelligence operates at the intersection of physics‑based modeling, big data analytics, machine learning, and advanced control systems. As a Data Scientist Intern, you will collaborate with a team of highly motivated, agile engineers to contribute to:

  • Development of analytics pipelines and platforms leveraging big‑data and machine‑learning techniques to support a global installed base of wafer fabrication equipment
  • Creation and training of deep learning and machine learning models for equipment performance characterization, anomaly detection, predictive maintenance, and optimization
  • Integration of empirical learning methods with physics‑based or first‑principles models
  • Exploration, cleaning, and analysis of complex, high‑volume equipment datasets
  • Support of model validation, deployment workflows, and documentation of findings

Requirements

  • Currently enrolled in a Bachelor’s or Master’s program in Computer Science, Data Science, Electrical Engineering, Mechanical Engineering, Applied Physics, Materials Science, or a related quantitative field
  • Able to intern for at least 3 months; 6–9 months preferred if available
  • Strong analytical, quantitative, and problem‑solving skills
  • Ability to learn new tools, modeling techniques, and domain knowledge quickly
  • Ability to work independently on scoped tasks and collaborate effectively within multidisciplinary teams
  • Strong written and verbal communication skills

Nice To Haves

  • Coursework or project experience in machine learning, deep learning, statistical learning, or data mining
  • Experience building models using modern ML/DL frameworks (e.g., PyTorch, TensorFlow, JAX, Scikit‑learn)
  • Familiarity with distributed compute environments (cloud platforms, Spark, Ray, or HPC systems)
  • Experience with Python for data science (NumPy, Pandas, Matplotlib, etc.)
  • Experience working with large datasets, time‑series data, or sensor/telemetry data
  • Familiarity with experiment design, model validation, or data pipelines
  • Interest in semiconductor manufacturing, advanced equipment, or applied physics
  • Proficiency in Python or similar high‑level languages
  • Understanding of core ML/DL concepts and ability to implement models from examples or academic references
  • Comfort working with modern development tools (Git, notebooks, VS Code, containerization, etc.)
  • Ability to present complex quantitative concepts clearly and visually
  • Curiosity and willingness to learn in a highly multidisciplinary environment

Responsibilities

  • Collaborate with Hardware, Process, and Software engineering teams to define data requirements and guide data‑driven product development
  • Communicate insights, results, and visualizations to internal engineering teams and, when appropriate, global customers
  • Participate in experiment planning with engineering teams and support interpretation of experiment results
  • Contribute to improving internal tooling, workflows, and automation