Lead Data Engineer

Optiver
Chicago, IL

About The Position

Optiver Chicago is seeking a seasoned Data Engineer to support and advance the data capabilities of our local research team and contribute to our broader Lakehouse architecture vision. In this role, you'll work with cutting-edge tools like Databricks, Apache Spark, and Delta Lake, translating experimental research workflows into scalable, production-grade systems. As a critical engineering presence in Chicago, you'll directly influence local data strategy while collaborating with our Global Research and Data Platform teams. Your work will drive real-time insights, enable predictive analytics, and power decision-making across the trading lifecycle. This role combines hands-on data engineering, cross-regional collaboration, and platform innovation, all within a high-performance, data-driven environment.

Requirements

  • 5+ years of hands-on experience in data engineering, delivering robust pipelines at scale
  • Advanced Python skills and deep experience with Apache Spark and the Databricks platform
  • Familiarity with Delta Lake, streaming data systems (e.g., Kafka), and distributed compute environments
  • Solid understanding of cloud-native data architectures (preferably AWS) and infrastructure cost optimization principles
  • Proficiency in relational databases (e.g., PostgreSQL) and modern orchestration tools
  • Proven ability to lead projects independently and deliver outcomes in fast-paced environments
  • Clear communicator who collaborates well with researchers, traders, and engineers alike
  • Enthusiastic mentor and strong advocate for engineering rigor and platform scalability
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline

Nice To Haves

  • Experience with system-level languages (e.g., C++, Rust) and exposure to MLOps or MLflow

Responsibilities

  • Design, build, and maintain reliable ETL/ELT pipelines using Spark, Structured Streaming, Databricks, and our in-house high-performance tools (an illustrative sketch of this kind of pipeline appears after this list)
  • Optimize and productionize research workflows with a strong focus on scalability, resilience, and performance tuning
  • Collaborate with power users to develop and share reusable patterns, templates, and onboarding pathways
  • Define, document, and enforce data engineering best practices
  • Mentor junior engineers and drive a culture of continuous learning and DataOps excellence
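
As referenced above, here is a minimal sketch of the kind of streaming ETL pipeline this role involves: Spark Structured Streaming reading from Kafka and appending to a Delta Lake table. The broker address, topic name, payload schema, and storage paths are hypothetical placeholders rather than details from this posting, and the sketch assumes the Kafka and Delta Lake connectors are available on the cluster (as they typically are on Databricks runtimes).

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    # Assumes the Kafka source and Delta Lake sink are available on the cluster.
    spark = SparkSession.builder.appName("example-streaming-etl").getOrCreate()

    # Hypothetical payload schema for the incoming Kafka messages.
    payload_schema = StructType([
        StructField("symbol", StringType()),
        StructField("price", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Read the raw stream from a placeholder Kafka topic.
    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "market-events")               # placeholder topic
        .load()
    )

    # Parse the JSON payload and bound state with an event-time watermark.
    parsed = (
        raw.select(F.from_json(F.col("value").cast("string"), payload_schema).alias("event"))
        .select("event.*")
        .withWatermark("event_time", "10 minutes")
    )

    # Append parsed events to a Delta table, with checkpointing for
    # restartable, fault-tolerant processing.
    query = (
        parsed.writeStream
        .format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/market_events")  # placeholder path
        .outputMode("append")
        .start("/tmp/delta/market_events")                               # placeholder table path
    )

A production version of a pipeline like this would add schema enforcement, monitoring, and orchestration, in line with the best practices this role is expected to define.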

Benefits

  • Immediate impact on the data systems powering world-class research and real-time trading decisions
  • A unique opportunity to shape Lakehouse engineering in the United States and influence global data architecture
  • High autonomy to own complex workflows and build reusable "how-to" solutions across teams
  • Close collaboration with quant researchers and traders to unlock predictive insights and trading alpha
  • Partnership with best-in-class engineers across Chicago, Amsterdam, and Sydney
  • The opportunity to work alongside best-in-class professionals from over 40 different countries
  • 401(k) match up to 50%
  • Comprehensive health, mental, dental, vision, disability, and life coverage
  • 25 paid vacation days alongside market holidays
  • Extensive office perks, including breakfast, lunch and snacks, regular social events, clubs, sporting leagues and more

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Industry: Securities, Commodity Contracts, and Other Financial Investments and Related Activities
  • Number of Employees: 1,001-5,000
