Data Engineering Intern

Stride, Inc.
Remote

About The Position

The Data Engineering Intern at Stride supports the development and maintenance of scalable data pipelines and integrations that connect core systems and enable data-driven decision making for online K–12 learning and school operations. Working within a collaborative remote team, the intern partners with data engineers, analysts, and architects to build reliable data workflows and integrations on an agile cycle, gaining hands-on experience in cloud-based data engineering while supporting Stride’s mission to empower online education for students across the United States.

Requirements

  • Currently pursuing a Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field
  • Basic understanding of programming concepts, preferably in Python
  • Familiarity with SQL and working with data in relational databases
  • Foundational knowledge of data structures, data processing, or ETL concepts
  • Strong problem-solving skills and willingness to learn new technologies
  • Ability to work in a collaborative, team-oriented environment
  • Strong verbal and written communication skills
  • Proficiency in Microsoft Office Suite (Excel, Word, PowerPoint)
  • Ability to work up to 40 hours per week for the duration of the internship
  • Ability to pass the required background check

Nice To Haves

  • Exposure to cloud platforms such as AWS, Azure, or GCP through coursework or projects
  • Coursework or project experience involving data pipelines, APIs, or cloud-based data tools
  • Familiarity with version control systems such as Git
  • Exposure to data warehousing concepts (e.g., Snowflake, Redshift, BigQuery)
  • Interest in education technology and data-driven solutions

Responsibilities

  • Assist in developing and maintaining data pipelines and integrations that support business processes and analytics use cases.
  • Support the design and implementation of data workflows, partnering with engineers and analysts to understand requirements and translate them into technical solutions.
  • Write and maintain clean, modular code in Python and SQL to support data ingestion, transformation, and integration tasks.
  • Assist in testing data pipelines, including unit testing and validation, to ensure data quality and reliability.
  • Support CI/CD processes by helping validate and deploy data solutions in development and staging environments.
  • Monitor data pipelines and assist in troubleshooting issues, identifying root causes, and implementing fixes.
  • Participate in agile ceremonies including sprint planning, standups, and reviews to support team collaboration and learning.
  • Document data processes, workflows, and technical solutions to support team knowledge sharing and maintainability.

Benefits

  • Health benefits
  • Retirement contributions
  • Paid time off