2026 Data Warehouse Management Co-op

General Atlantic · Rochester, NY
$32 - $38 per hour

About The Position

General Atlantic is a leading global investor with more than four and a half decades of experience providing capital and strategic support for over 830 companies throughout its history. Established in 1980, General Atlantic continues to be a dedicated partner to visionary founders and investors seeking to build dynamic businesses and create long-term value. Guided by the conviction that entrepreneurs can be incredible agents of transformational change, the firm combines a collaborative global approach, sector-specific expertise, a long-term investment horizon, and a deep understanding of growth drivers to partner with and scale innovative businesses around the world. The firm leverages its patient capital, operational expertise, and global platform to support a diversified investment platform spanning Growth Equity, Credit, Climate, and Sustainable Infrastructure strategies. General Atlantic manages approximately $118 billion in assets under management, inclusive of all strategies, as of September 30, 2025, with more than 900 professionals in 20 countries across five regions. For more information on General Atlantic, please visit: www.generalatlantic.com.

Position Summary

This role supports the Data Warehouse Management team in New York City, working closely with engineers and stakeholders on data-driven initiatives across the organization. The role provides opportunities to apply technical and analytical skills to build and optimize data pipelines using modern data engineering practices and cloud-based data platforms.

Requirements

  • Eligible to participate in Northeastern University’s undergraduate Co-op program
  • Current Northeastern University student pursuing a degree in Computer Science, Information Science, or a related field
  • Strong programming ability in two or more languages (e.g., Python, Java, Scala, SQL)
  • Ability to work in a fast-paced environment with professionalism, accuracy, and strong attention to detail
  • Proactive mindset with the ability to solve problems, anticipate needs, and manage multiple tasks
  • Strong work ethic, flexibility, and a collaborative “can-do” attitude
  • Excellent written and verbal communication skills

Nice To Haves

  • Experience with cloud data technologies, ideally Microsoft Azure (Azure Data Factory, Synapse, Databricks, etc.)
  • Familiarity with Python, SQL, or other data-oriented languages
  • Exposure to data modeling, ETL/ELT workflows, or big data frameworks (Spark preferred)
  • Understanding of database concepts and performance considerations

Responsibilities

  • Assist in building and maintaining data pipelines that support analytics, reporting, and operational workflows
  • Develop and optimize data ingestion, transformation, and validation processes
  • Work with engineers and stakeholders to understand data requirements and translate them into scalable solutions
  • Help ensure data quality, reliability, and documentation across systems
  • Participate in code reviews, testing, and continuous improvement efforts

Benefits

  • Competitive compensation of $32 - $38 per hour
  • Professional development opportunities and ongoing training
  • Collaborative and inclusive work culture with opportunities for advancement