Enterprise Data & AI Intern

Amplitude · San Francisco, CA
Hybrid

About The Position

Join Amplitude's Enterprise Data and Insights team for a 12-week immersive internship at the intersection of data analytics and generative AI. You'll work alongside senior data scientists, analysts, and AI engineers to solve real business problems using both traditional analytics and cutting-edge AI techniques. This isn't a passive learning experience: you'll contribute to live projects from day one.

Requirements

  • Foundational SQL skills and working knowledge of Python, including libraries such as pandas and NumPy.
  • Curiosity about how AI is reshaping how companies use data, with a bias toward learning by building.
  • Strong communicator who asks good questions, documents their work clearly, and is comfortable diving into ambiguous problems.
  • Currently pursuing a degree in Computer Science, Data Science, Statistics, Information Systems, or a related technical field.
  • Ability to commit to 40 hours per week for the full 12-week summer program, with availability to relocate to San Francisco and work from our HQ on a hybrid schedule (2–3 days in-office per week).
  • Comfort with cross-functional communication, including clearly presenting your contributions and their impact at the final project presentation.
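
For context, the "foundational SQL and Python" bar above is roughly the level of this sketch: a grouped aggregation in pandas of the kind a usage dashboard might surface. The dataset and column names are invented for illustration, not taken from Amplitude's actual stack.

```python
import pandas as pd

# Hypothetical event-level data; accounts, plans, and counts are made up.
events = pd.DataFrame({
    "account": ["acme", "acme", "globex", "globex", "initech"],
    "plan": ["enterprise", "enterprise", "growth", "growth", "growth"],
    "events_sent": [1200, 950, 400, 380, 150],
})

# Roll usage up to the plan tier, largest tiers first.
usage_by_plan = (
    events.groupby("plan", as_index=False)["events_sent"]
    .sum()
    .sort_values("events_sent", ascending=False)
    .reset_index(drop=True)
)
print(usage_by_plan)
```

The same rollup would typically be expressed in SQL as a `GROUP BY` over the warehouse table; being comfortable moving between the two forms is the skill the bullet describes.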

Nice To Haves

  • Exposure to tools like dbt, Snowflake, LangChain, OpenAI APIs, or BI tools such as Looker or Tableau is a plus but not required.

Responsibilities

  • Design and build dashboards and data pipelines that surface insights for stakeholders, including writing SQL queries against large datasets and contributing to data models in our analytics stack.
  • Work on applied GenAI initiatives such as prototyping LLM-powered tools for internal workflows and exploring retrieval-augmented generation (RAG) architectures.
  • Gain hands-on experience with tools including Snowflake, dbt, Glean, and Workato, as well as prompt engineering and AI evaluation frameworks.
  • Partner with business teams to translate ambiguous questions into structured analyses and communicate data-driven recommendations to non-technical audiences.
  • Participate in sprint planning, standups, and design reviews, culminating in a final project presentation showcasing the impact of your work over 12 weeks.
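
To make the RAG responsibility above concrete, here is a toy sketch of the two core steps: retrieve the most relevant document for a question, then assemble it into a prompt. Real pipelines replace the keyword-overlap scoring below with an embedding model and a vector store, and send the assembled prompt to an LLM; the documents here are invented examples, not actual internal content.

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase a string and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the question."""
    q_tokens = tokenize(question)
    return max(documents, key=lambda doc: len(q_tokens & tokenize(doc)))

def build_prompt(question: str, context: str) -> str:
    """Assemble the retrieved context and the question into an LLM prompt."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Snowflake stores our warehouse tables; dbt builds the data models.",
    "Workato automates cross-system workflows between internal tools.",
    "Glean provides enterprise search across company knowledge bases.",
]

question = "Which tool builds our data models?"
prompt = build_prompt(question, retrieve(question, docs))
print(prompt)
```

Swapping the retrieval function for embedding similarity, chunking long documents, and evaluating answer quality are exactly the kinds of extensions an intern project in this area might explore.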
© 2024 Teal Labs, Inc