Data Engineering Intern

Intuitive
Sunnyvale, CA

About The Position

Primary Function of Position

The Ion endoluminal system is Intuitive’s robotic platform for minimally invasive biopsy in the peripheral lung, with the goal of improving early lung cancer diagnosis. Ion’s Data Engineering team builds the infrastructure, data pipelines, and services that power applications and product insights across the organization. As a Data Engineering intern, you will help design and develop robust, performant data pipelines and supporting applications, working closely with data engineers, data scientists, and software teams to deliver high-quality solutions that drive real impact. Essential job duties are listed under Responsibilities below.

Requirements

  • University Enrollment: Must be currently enrolled in and returning to an accredited degree-seeking academic program after the internship.
  • Internship Work Period: Must be available to work full-time (approximately 40 hours per week) for a 10–12-week period starting in May or June. Specific start dates are shared during the recruiting process.
  • Current enrollment in a Computer Science, Data Science, Computer Engineering, Electrical & Computer Engineering, or related degree-seeking program at the bachelor’s level or above.

Nice To Haves

  • Proficiency in Python, SQL, and related languages.
  • Experience with Snowflake, Databricks, AWS, Airflow, Kafka, or dbt.
  • Familiarity with Git/version control, code reviews, unit testing, and CI/CD workflows.
  • Strong problem-solving and communication skills, with a collaborative, team-oriented mindset.

Responsibilities

  • Contribute to the design and implementation of log-based data pipelines, real-time event streams, and/or data models.
  • Contribute to monitoring, alerting, and data quality checks to ensure pipeline reliability.
  • Collaborate with data scientists and adjacent software teams to ensure pipelines and data models meet their needs.
  • Help create and maintain technical documentation.
  • Participate in code reviews, sprint planning, and standups.