About The Position

Launching your tech career at Intact means joining a diverse team of more than 3,000 Digital, Data and Tech experts working at the intersection of what exists and what’s possible. Here, you’ll be supported by forward-thinking leaders who celebrate shared success, and you’ll help push the industry forward with digital solutions that go beyond insurance to offer everyday value to millions of people. You’ll grow personally and professionally with access to cutting-edge technology-driven learning platforms and make lasting connections near and far. Most importantly, you’ll discover how exciting the “real world” can be. Here, your career will take off 🚀 We are looking for Data Engineering Developer interns to join our growing team!

Requirements

  • Currently pursuing a Bachelor's or Master's in artificial intelligence, data science, computer science, computer engineering, mathematics, statistics or any related field.
  • Available to work with us full-time, 35 hours per week, for the Fall 2026 Term (September to December).
  • Must be an active student during your internship and/or returning to school for the Winter 2027 term following your internship.
  • For candidates located in Quebec, bilingualism is required considering the necessity to interact on a regular basis with English speaking colleagues across the country.
  • No Canadian work experience required, but must be eligible to work in Canada.
  • A strong enthusiasm for data engineering and cloud technologies, along with a solid understanding of data and Big Data challenges, database systems, machine learning, and artificial intelligence.
  • Proficiency in Python and SQL is required.

Nice To Haves

  • Knowledge of PySpark and AWS is an asset.

Responsibilities

  • Participate in the development and delivery of efficient, effective and secure data flows.
  • Build, optimise and maintain ETL/ELT data pipelines to ingest data into Intact’s enterprise data platform, leveraging Python and Databricks.
  • Contribute to Intact’s best practices for depersonalizing datasets.
  • Work on data modeling and building reporting capabilities.
  • Work on certifying the data as a trusted asset by building data reconciliation pipelines.
  • Participate in the development of data quality controls.
  • Learn the best practices with the use of cutting-edge technologies such as AWS, Databricks, and Snowflake.
  • Join a cross-functional team, collaborating closely with developers, DevOps engineers, data scientists, and end users.
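For a flavour of the pipeline work described above — ingesting records, depersonalizing them, and applying quality controls — here is a minimal sketch in plain Python. Every detail (field names, the salt, the steps) is a hypothetical illustration, not Intact’s actual schema, platform, or tooling:

```python
import hashlib

# Hypothetical salt and field names for illustration only.
SALT = "example-salt"

def depersonalize(record):
    """Replace direct identifiers with salted hashes (a common masking technique)."""
    masked = dict(record)
    for field in ("name", "email"):
        if field in masked:
            digest = hashlib.sha256((SALT + masked[field]).encode()).hexdigest()
            masked[field] = digest[:12]  # keep a short, non-reversible token
    return masked

def quality_check(record):
    """Basic data-quality control: required fields present and non-empty."""
    return all(record.get(f) for f in ("id", "email"))

def run_pipeline(raw_records):
    """Extract -> quality-filter -> depersonalize -> load (here, just a list)."""
    return [depersonalize(r) for r in raw_records if quality_check(r)]

raw = [
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
    {"id": 2, "name": "Bob", "email": ""},  # fails the quality check
]
clean = run_pipeline(raw)
```

In a real Databricks or PySpark setting the same shape applies, with DataFrames in place of lists and dedicated masking and expectation tooling in place of the hand-rolled helpers.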

Benefits

  • Support
  • Opportunities
  • Performance-led financial rewards
  • A workplace where you can shape the future, win as a team and grow with us
  • Access to cutting-edge technology-driven learning platforms