Data Governance and Data Engineering Intern

Upbound Group
Plano, TX
Onsite

About The Position

Upbound Group's 2026 Summer Intern Program offers an immersive professional development experience that goes beyond traditional internships, giving participants a comprehensive opportunity to grow both personally and professionally through dynamic networking events, engaging volunteer activities, and exclusive executive leadership speaker sessions. Interns can expect a robust learning environment that balances structured training modules with self-paced learning pathways, enabling them to explore their potential, develop critical skills, and gain meaningful insight into corporate culture. The ten-week program will run from June 1 through August 7, 2026, and is an in-office experience at our headquarters in Plano, TX.

Join Upbound Group's (UPBD) Data Governance team this summer for a hands-on internship at a $4.6 billion financial services firm. We are looking for a motivated student to help with our data quality, data catalog, and data lineage initiatives. You will use Python, SQL, metadata, and AI agents to create data quality checks that run against Snowflake data assets. You may be asked to help document and trace the flow of data through our enterprise, which will require some familiarity with YAML and GitHub. You will work directly with leadership to solve real-world data challenges in a professional, large-scale environment.

This role is intended for individuals who are excited about data science, data engineering, data management, or data analytics. As such, we prefer candidates majoring in Information Management, Computer Science, Software Engineering, or any data discipline. Those from other areas of study are encouraged to apply if they have the requisite skills and interest.

Requirements

  • Some experience with a relational database such as PostgreSQL, SQL Server, Oracle, DuckDB, MariaDB, Snowflake, or MySQL.
  • Experience writing SQL.
  • Comfortable with, and able to explain, CTEs, subqueries, EXISTS, inner/outer/lateral joins, MINUS/INTERSECT/EXCEPT/UNION, GROUP BY, HAVING, and window functions.
  • Some experience coding in Python.
  • Have experimented with AI and built something useful with it.
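As a rough illustration of the SQL fluency described above, here is a small self-contained example that combines a CTE, GROUP BY/HAVING, and a window function. It uses Python's built-in sqlite3 module for portability (the role itself targets Snowflake), and the `sales` table and its data are hypothetical:

```python
import sqlite3

# In-memory database with a small hypothetical sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("TX", 120.0), ("TX", 80.0), ("CA", 50.0), ("NY", 200.0), ("NY", -10.0)],
)

# The CTE aggregates per region (GROUP BY + HAVING), then a window
# function ranks regions by total without a second aggregation pass.
query = """
WITH regional AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > 0
)
SELECT region, total,
       RANK() OVER (ORDER BY total DESC) AS rnk
FROM regional
ORDER BY rnk
"""
rows = conn.execute(query).fetchall()
for region, total, rnk in rows:
    print(rnk, region, total)
```

Note that SQLite window functions require SQLite 3.25 or later, which ships with all currently supported Python versions; the equivalent syntax works unchanged on Snowflake.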

Nice To Haves

  • Experience working with datasets in the millions of rows.
  • Used advanced AI models to augment coding activities.

Responsibilities

  • Help design and implement enterprise-grade data quality monitoring, including deterministic validation rules, reconciliation and balance checks, and analytical techniques such as statistical profiling and anomaly detection, to ensure accurate and trustworthy financial data.
  • Map complex data flows to ensure transparency and traceability across the organization.
  • Assist in examining and improving pipelines built on Fivetran, Python, and Snowflake.
  • Gain professional experience working with Python, GitHub, agentic AI, Snowflake, and PostgreSQL.
  • Work alongside senior leaders in data science, data engineering, data analytics, finance, marketing, sales, and operations to understand the cost of poor data quality and how data governance drives impactful outcomes in the financial services industry.
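To make the data quality monitoring responsibility concrete, here is a minimal sketch of the two kinds of checks it names: deterministic validation rules and statistical anomaly detection via a simple z-score test. The column names, sample values, and thresholds are hypothetical, and a real implementation would run against Snowflake result sets rather than in-memory lists:

```python
from statistics import mean, stdev

def deterministic_checks(rows):
    """Rule-based validation: flag rows that break hard constraints."""
    failures = []
    for i, row in enumerate(rows):
        if row["amount"] is None:            # completeness rule
            failures.append((i, "amount is NULL"))
        elif row["amount"] < 0:              # validity rule
            failures.append((i, "amount is negative"))
    return failures

def anomaly_check(values, z_threshold=3.0):
    """Statistical profiling: flag values more than z_threshold
    sample standard deviations away from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > z_threshold]

# Hypothetical sample data standing in for a Snowflake result set.
rows = [{"amount": 100.0}, {"amount": None}, {"amount": -5.0}, {"amount": 98.0}]
print(deterministic_checks(rows))   # flags the NULL and negative rows

amounts = [100.0] * 11 + [5000.0]
print(anomaly_check(amounts))       # flags the extreme outlier
```

In practice these checks would be parameterized by metadata (table, column, rule type) so the same Python harness can be pointed at many Snowflake data assets.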

Benefits

  • Paid hourly in accordance with usual payroll procedures.