Ripple · Posted 1 day ago
Intern
San Francisco, CA
501-1,000 employees

At Ripple, we’re building a world where value moves like information does today. It’s big, it’s bold, and we’re already doing it. Through our crypto solutions for financial institutions, businesses, governments, and developers, we are improving the global financial system and creating greater economic fairness and opportunity for more people, in more places around the world. And we get to do the best work of our careers and grow our skills surrounded by colleagues who have our backs. If you’re ready to see your impact and unlock incredible career growth opportunities, join us and build real-world value.

THE WORK

As a Ripple intern, you will work on challenges that impact our mission and shape our company. With support from your team, access to learning and development resources, and opportunities for fun along the way, our program will give you the foundation to start your career journey.

As a Software Engineer Intern, you will support the development of Ripple’s central Data Platform, contributing to robust data design and governance practices. You will help shape the data foundations that enable analytics, machine learning, and key business operations across the company. Your work will involve assisting in the creation and maintenance of high-quality data dictionaries, supporting the development of data standards, and applying best practices to ensure consistency, reliability, and trust in Ripple’s data assets. You are curious about how data moves through complex systems and eager to identify data quality issues, structural inefficiencies, and opportunities to improve accuracy, scalability, and performance. You will participate in architectural and design discussions, learning how to evaluate tradeoffs and help build clean, well-modeled, and well-documented data structures.
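To give a flavor of the data-dictionary work described above, here is a minimal illustrative sketch in Python. It is not Ripple’s actual tooling or schema; the ColumnDefinition class, the field names, and the example entry are all hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical shape of a single data-dictionary entry; a real
# dictionary would carry additional governance metadata.
@dataclass
class ColumnDefinition:
    name: str                 # canonical column name shared across systems
    data_type: str            # logical type, e.g. "VARCHAR(16)"
    description: str          # business definition agreed with stakeholders
    owner: str                # team accountable for this field
    nullable: bool = True
    allowed_values: list[str] = field(default_factory=list)  # optional enumeration

# Example entry: one consistent definition that analytics, ML features,
# and reports can all reference instead of redefining it per system.
payment_status = ColumnDefinition(
    name="payment_status",
    data_type="VARCHAR(16)",
    description="Final settlement state of a payment as reported by the ledger.",
    owner="data-platform",
    nullable=False,
    allowed_values=["PENDING", "SETTLED", "FAILED"],
)
```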

  • Collaborate with teams to define and maintain a comprehensive data dictionary for consistent data definitions across systems.
  • Implement data quality frameworks to ensure accuracy, completeness, and reliability of data (a rough sketch of this kind of check appears after these bullets).
  • Establish and carry out data governance policies to maintain compliance, security, and proper data usage.
  • Design and promote best practices for data modeling, pipelines, and infrastructure to support scalable and maintainable solutions.
  • Partner with stakeholders to ensure data structures and processes align with business objectives and analytical needs.
  • Currently enrolled in an Undergraduate, Graduate or PhD program, preferably in a science or quantitative field.
  • Available to work for 12 weeks during Summer 2026, beginning in May or June.
  • Intent to return to your degree program after the completion of the internship.
  • Coursework or previous internship experience with software engineering, ideally involving data-oriented applications (Python, Java, or other programming languages).
  • Experience building ETL and ELT data pipelines.
  • Experience writing SQL queries against data warehouses such as Redshift or BigQuery.
  • Knowledge of building REST API endpoints.
  • Excellent written and verbal communication skills and attention to detail with a commitment to excellence.
  • Experience with real-time pipelines is a big plus.
  • Exposure to Hadoop and NoSQL databases such as HBase or Cassandra is a plus.
  • Exposure to CI/CD and workflow orchestration tools (e.g., Airflow or similar) is a plus.
  • Experience with Terraform or similar infrastructure-as-code tools is a huge plus.
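Several of the bullets above (data quality frameworks, ETL/ELT pipelines, and SQL against data warehouses) hint at the day-to-day work. The sketch below is a minimal, illustrative data quality check in Python; it uses an in-memory SQLite table as a stand-in for a warehouse table (Redshift or BigQuery in practice), and the `payments` table and its columns are made up for illustration, not Ripple’s schema.

```python
import sqlite3

# Stand-in for a warehouse connection; the `payments` table and its
# columns are hypothetical and exist only for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (
        payment_id     TEXT PRIMARY KEY,
        payment_status TEXT,
        amount_usd     REAL
    );
    INSERT INTO payments VALUES
        ('p1', 'SETTLED', 100.0),
        ('p2', 'PENDING', 25.5),
        ('p3', NULL,      -10.0);   -- rows that should fail the checks
""")

# Simple declarative checks: each query counts rows that violate a rule.
# A real data quality framework would schedule these and alert on failures.
CHECKS = {
    "payment_status must not be null":
        "SELECT COUNT(*) FROM payments WHERE payment_status IS NULL",
    "payment_status must be a known value":
        "SELECT COUNT(*) FROM payments "
        "WHERE payment_status NOT IN ('PENDING', 'SETTLED', 'FAILED')",
    "amount_usd must be non-negative":
        "SELECT COUNT(*) FROM payments WHERE amount_usd < 0",
}

for rule, sql in CHECKS.items():
    bad_rows = conn.execute(sql).fetchone()[0]
    status = "OK" if bad_rows == 0 else f"FAILED ({bad_rows} rows)"
    print(f"{rule}: {status}")
```

In practice, checks like these would typically run as a step inside an ETL/ELT pipeline (for example, orchestrated by Airflow) and record their results somewhere auditable; the sketch simply prints them.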