Ripple
Intern
San Francisco, CA
501-1,000 employees

At Ripple, we’re building a world where value moves like information does today. It’s big, it’s bold, and we’re already doing it. Through our crypto solutions for financial institutions, businesses, governments, and developers, we are improving the global financial system and creating greater economic fairness and opportunity for more people, in more places around the world. And we get to do the best work of our careers and grow our skills surrounded by colleagues who have our backs. If you’re ready to see your impact and unlock incredible career growth opportunities, join us and build real-world value.

THE WORK

As a Ripple intern, you will work on challenges that impact our mission and shape our company. With support from your team, access to learning and development resources, and opportunities for fun along the way, our program will give you the foundation to start your career journey.

As a Software Engineer Intern, you will be one of the core builders on Ripple’s central Data Engineering team. This team implements the data ingestion and transformation that power analytics, machine learning, and various business functions at Ripple. You are curious about the bottlenecks and failure modes of a system and look for opportunities to continually improve its cost and performance characteristics. You are hands-on in driving key technical decisions, ensuring the right trade-offs are made to deliver high-quality results and high, measurable customer value. You work well across functions and teams, including data science, product, application engineering, compliance, finance, and others. Your passion for good engineering is complemented by strong instincts for delivering value.

WHAT YOU’LL DO

  • Ships solutions efficiently for both large and small projects.
  • Owns the development and release of assigned tasks, proofs of concept, and new mini-projects.
  • Writes clean tech specs and identifies risks before implementation.
  • Recognizes trade-offs and identifies the impact and risks of alternative solutions.
  • Improves code structure and participates in architecture discussions for data pipelines.
  • Breaks down work into sprints and tasks.
WHAT YOU’LL BRING

  • Currently enrolled in an undergraduate or graduate degree program, preferably in Computer Science or a related field.
  • Available to work for 12 weeks during Summer 2026, beginning in May or June.
  • Intent to return to your degree program after the completion of the internship.
  • Experience in at least one programming language (e.g., Python or Scala) and comfort working with SQL.
  • Experience with at least one data warehouse or data lake platform, such as Databricks.
  • Ability to write sophisticated code and comfort picking up new technologies independently.
  • Enthusiasm for helping teams push the boundaries of analytical insights, create new product features using data, and power machine-learning models.
  • Familiarity with developing distributed systems and experience building scalable data pipelines.
  • Familiarity with data technologies such as Spark or Flink, and comfort engineering data pipelines with them on financial datasets.
  • Experience with RESTful APIs and server-side API integration.
  • Hands-on or conceptual familiarity with AWS cloud resources (S3, Lambda, API Gateway, Kinesis, Athena, etc.).
  • Experience in orchestrating CI/CD pipelines using GitLab, Helm, and Terraform.
  • Appreciation for excellent documentation and strong data-debugging skills.
  • Excitement about operating independently, striving for excellence, and learning new technologies and frameworks.