Data Science Engineer Internship - Summer 2026

CCC Intelligent Solutions | Chicago, IL
Pay range: $20 - $43

About The Position

CCC Intelligent Solutions Inc. (CCC) is a leading cloud platform for the multi-trillion-dollar insurance economy, creating intelligent experiences for insurers, repairers, automakers, parts suppliers, and more. At CCC, we’re making life just work by empowering more than 35,000 businesses with industry-leading technology to get drivers back on the road and to health quickly and seamlessly. We’re pushing boundaries with innovative AI solutions that simplify and enhance the claims and repair journey. Through purposeful innovation and the strength of its connections, CCC technologies empower the people and industry relied upon to keep lives moving forward when it matters most. Learn more about CCC at www.cccis.com.

The Role

Our program is designed to #CCCJumpstart your career! At CCC, you will work and learn alongside innovative and inspiring leaders and gain valuable technical experience while working on real business solutions in a corporate setting.

Requirements

  • To be considered for this role, you must be pursuing an Associate, Bachelor's, or Master's degree throughout your internship.
  • We are looking for candidates with strong collaboration skills who work well on a team.
  • A strong interest in computer science and/or related fields is essential to the success of each intern.
  • If you have knowledge of Python, Machine Learning, or Statistics/Modeling through coursework or project work, this opportunity is right for you.

Responsibilities

  • Technology: Airflow, PySpark, Kafka, Spark Streaming, Machine Learning, ETL
  • Build end-to-end pipelines that create fully curated and enhanced data sets for the auto-insurance industry as well as internal teams.
  • Use PySpark, Spark Streaming, and Kafka to ingest different types of data, including relational databases, JSON, and zipped XML data (see the PySpark sketch after this list).
  • Produce software data building blocks, data models, and data flows for varying client demands, such as dimensional data, data feeds, dashboard reporting, and data science research and exploration.
  • Utilize Apache Airflow to create DAGs that handle complex tasks such as cross-DAG dependencies, orchestrating concurrent pipelines, monitoring tasks, and alerting when SLAs are not met (see the Airflow sketch after this list).
  • Technical environment: programming with Python and Spark; open-source big data tools (Hive, Spark, Kafka); the AWS ecosystem (Amazon EMR, S3, Presto); Airflow for scheduling and monitoring big data ETL pipelines; SQL for data profiling and data validation; Unix commands and scripting; Hadoop fundamentals and architecture; machine learning fundamentals, Generative AI, and Agentic AI.
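
The ingestion work described above follows a common PySpark pattern. Here is a minimal sketch of reading JSON events from a Kafka topic with Spark Structured Streaming and landing them as Parquet; the broker address, topic name, schema, and paths are hypothetical placeholders rather than CCC's actual configuration, and the Kafka source assumes the spark-sql-kafka connector package is on the Spark classpath.

```python
# Minimal sketch: ingest JSON claim events from Kafka with Spark Structured
# Streaming. Topic, broker, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("claims-ingest-sketch").getOrCreate()

# Hypothetical schema for one JSON claim event.
claim_schema = StructType([
    StructField("claim_id", StringType()),
    StructField("vehicle_vin", StringType()),
    StructField("estimate_amount", DoubleType()),
])

# Kafka delivers each record's payload as bytes in the `value` column,
# so cast it to a string before parsing the JSON.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "claims-events")                  # placeholder topic
    .load()
)

claims = (
    raw.select(from_json(col("value").cast("string"), claim_schema).alias("claim"))
       .select("claim.*")
)

# Land the parsed records as Parquet for downstream curation and reporting.
(
    claims.writeStream
    .format("parquet")
    .option("path", "/tmp/curated/claims")
    .option("checkpointLocation", "/tmp/checkpoints/claims")
    .start()
    .awaitTermination()
)
```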
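
Likewise, the Airflow bullet mentions cross-DAG dependencies and SLA alerting. A minimal sketch under assumed names (the upstream DAG claims_ingest and its task land_raw_events are hypothetical), using an ExternalTaskSensor for the cross-DAG wait and a task-level SLA with a DAG-level miss callback:

```python
# Minimal sketch: an Airflow DAG with a cross-DAG dependency and SLA alerting.
# DAG names, task names, and the callback body are hypothetical placeholders.
# Uses Airflow 2.4+ style (`schedule=` rather than `schedule_interval=`).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.sensors.external_task import ExternalTaskSensor


def build_curated_dataset():
    # Placeholder for the real ETL step (e.g., submitting a Spark job).
    print("building curated dataset")


def sla_alert(dag, task_list, blocking_task_list, slas, blocking_tis):
    # Airflow calls this when a task misses its SLA; wire it to email/Slack.
    print(f"SLA missed for tasks: {task_list}")


with DAG(
    dag_id="curated_claims_pipeline",
    start_date=datetime(2026, 6, 1),
    schedule="@daily",
    catchup=False,
    sla_miss_callback=sla_alert,
) as dag:
    # Cross-DAG dependency: block until the upstream ingestion DAG's task
    # for the same logical date has finished.
    wait_for_ingest = ExternalTaskSensor(
        task_id="wait_for_ingest",
        external_dag_id="claims_ingest",     # hypothetical upstream DAG
        external_task_id="land_raw_events",  # hypothetical upstream task
        timeout=60 * 60,
    )

    curate = PythonOperator(
        task_id="build_curated_dataset",
        python_callable=build_curated_dataset,
        sla=timedelta(hours=2),  # triggers sla_miss_callback if exceeded
    )

    wait_for_ingest >> curate
```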

Benefits

  • 401K Match
  • Paid time off
  • Annual Incentive Plan
  • Performance Bonus
  • Comprehensive health insurance
  • Adoption Assistance
  • Tuition Reimbursement
  • Wellness Programs
  • Stock Purchase Plan options
  • Employee Resource Groups