Data Engineering Manager

Geneva Trading, Chicago, IL
Onsite

About The Position

This is a hands-on engineering leadership role in which you will own the platforms that capture, normalize, store, and distribute market data. You will spend a significant portion of your time writing production code, designing data systems, and solving complex engineering problems alongside a team of 2-3 engineers. The ideal candidate leads by building, not by delegating from a distance. Success in this role means understanding the data stack, shipping improvements to pipelines, building trust with trading and research teams, driving the technical roadmap for market data infrastructure, improving reliability and performance, and owning the market data platform end to end.

Requirements

  • 7+ years of hands-on data engineering or market data infrastructure experience — you are an active, practicing engineer who writes production code regularly.
  • 3+ years leading engineering teams while remaining deeply technical — your references can speak to your code contributions as well as your leadership.
  • Expert-level KDB+/Q — you write complex Q, optimize tick plant performance, and can debug production HDB issues independently.
  • Strong, production-quality Python — well-tested, packaged, maintainable systems-level code, not scripting or glue code.
  • Verifiable experience building low-latency market data decoders for real exchange protocols — you will personally own decoder development from day one.
  • Strong grasp of network-level market data concepts: multicast, packet capture, sequencing, and gap detection.
  • Fluent with Linux performance tools (perf, strace, tcpdump, numactl) and comfortable tuning systems at that level.
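To make the sequencing and gap-detection requirement concrete, here is a minimal illustrative sketch in Python of per-channel sequence tracking, the kind of logic a feed handler applies to exchange multicast streams. Everything here (class name, channel labels, method names) is hypothetical and for illustration only; it does not describe Geneva Trading's actual stack.

```python
# Hypothetical sketch: track the last seen sequence number per channel
# and record gaps so a recovery/replay request could be issued.
from dataclasses import dataclass, field

@dataclass
class SequenceTracker:
    """Tracks expected sequence numbers per channel and reports gaps."""
    expected: dict = field(default_factory=dict)
    gaps: list = field(default_factory=list)

    def on_packet(self, channel: str, seq: int) -> None:
        exp = self.expected.get(channel)
        if exp is not None and seq > exp:
            # Packets [exp, seq - 1] were missed on this channel.
            self.gaps.append((channel, exp, seq - 1))
        self.expected[channel] = seq + 1

tracker = SequenceTracker()
for seq in (1, 2, 5, 6):
    tracker.on_packet("A", seq)
print(tracker.gaps)  # [('A', 3, 4)]
```

In production the recorded ranges would typically drive a retransmission or snapshot-recovery request rather than just being collected in a list.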

Nice To Haves

  • Background in high-frequency trading, market making, or proprietary trading firms.
  • Proficiency in C or C++ for performance-critical decoder and capture components.
  • Experience with kernel-bypass or high-performance networking technologies.
  • Experience with streaming platforms used in real-time data pipelines.
  • Working knowledge of binary market data encoding standards.
  • Contributions to open-source data tooling or quantitative research infrastructure.

Responsibilities

  • Own the end-to-end market data pipeline — from multi-venue ingestion through normalization to near-real-time delivery — with a focus on correctness, resilience, and recoverability.
  • Integrate direct feed capture alongside third-party vendor data.
  • Build replay, recovery, and gap-detection capabilities.
  • Ensure sequencing, validation, and data availability at the speed the business requires.
  • Design and operate KDB+/Q platforms for real-time and historical market data, supporting analytical workflows for trading and research.
  • Optimize schema design, partitioning strategies, and query performance.
  • Manage data retention, lifecycle policies, and long-term maintainability.
  • Design scalable, reliable data delivery to downstream consumers using streaming and messaging technologies.
  • Define data contracts and schemas that downstream teams can depend on.
  • Balance real-time delivery with durability and replayability.
  • Build the internal tooling and shared libraries that make your team and others more productive.
  • Develop validation, monitoring, replay, and analytics tools.
  • Own supporting systems for reference data, configuration, and metadata.
  • Lead your team through hands-on contribution, design reviews, and code reviews. Set engineering standards by example.
  • Partner with trading and research teams to understand their data needs and translate them into platform improvements.
  • Own the reliability of core data systems, including debugging production issues during market hours.
  • Define monitoring, alerting, and data quality observability.

Benefits

  • Eligibility for a performance-based bonus.
  • Competitive total rewards package.
  • Comprehensive benefits program.