Senior Software Engineer, Data Compute

Robinhood
Bellevue, WA
Hybrid

About The Position

Robinhood is building an elite team to tackle the world's biggest financial problems using frontier technologies, and is looking for bold thinkers and sharp problem-solvers driven to make an impact. Robinhood is a place for ambitious people to do the best work of their careers, operating with high performance, speed, and ethics.

The Data Compute team is a foundational infrastructure group responsible for managing and evolving Robinhood's large-scale Spark and Airflow environments. The team acts as a platform provider for all of Robinhood engineering, supporting everything from real-time analytics to critical compliance and operations workflows. It is currently undertaking major modernization efforts, including migrating workloads to Databricks, adopting serverless patterns, and optimizing lakehouse fundamentals, with a focus on high reliability, cost efficiency, and an exceptional developer experience.

As a Senior Software Engineer on the Data Compute team, you will be a key builder of core ingestion and compute primitives. You will design and implement scalable infrastructure to support millions of daily jobs while modernizing the platform onto Delta Lake and Unity Catalog. Your work will directly influence data processing across the entire company, impacting product engineering and analytics. You will partner with engineering leaders to drive technical direction and ensure systems meet high standards for performance and governance, with the opportunity to define the next generation of data processing at Robinhood.

This role is based in the Bellevue, WA office and requires in-person attendance at least three days per week. Robinhood believes in the power of in-person work to accelerate progress, spark innovation, and strengthen community, offering an intentional and energizing office experience designed for high-performing teams.

Requirements

  • Extensive experience with large-scale Spark and Databricks or similar platform infrastructure.
  • Deep expertise in data orchestration using Airflow for complex job lifecycle management.
  • Proven track record with lakehouse fundamentals, including S3-based data lakes and table/storage formats such as Delta Lake and Parquet.
  • Familiarity with query and serving infrastructure such as Trino, Pinot, or Hive Metastore.
  • Ability to own multi-team platform reliability, including cost optimization and developer experience initiatives.

Responsibilities

  • Design and build scalable platform primitives for Spark and Airflow to support Robinhood’s global data infrastructure needs.
  • Lead the migration and modernization of Spark workloads to serverless Databricks and Delta Lake architectures.
  • Optimize compute resource utilization and efficiency to manage costs across large-scale distributed systems.
  • Collaborate with internal teams across analytics and product engineering to deliver a seamless, self-serve data processing experience.
  • Improve platform reliability and governance by implementing advanced metadata management and access controls via Unity Catalog and Trino.

Benefits

  • Challenging, high-impact work to grow your career
  • Performance-driven compensation with multipliers for outsized impact, bonus programs, equity ownership, and 401(k) matching
  • Top-tier benefits to fuel your work, including 100% paid health insurance for employees and 90% coverage for dependents
  • Access to the best AI tools on the market and continuous AI skill-building for every employee, technical or not
  • Lifestyle wallet: a highly flexible benefits spending account for wellness, learning, and more
  • Employer-paid life and disability insurance, fertility benefits, and mental health benefits
  • Time off to recharge, including company holidays, paid time off, sick time, parental leave, and more
  • Exceptional office experience with catered meals, events, and comfortable workspaces