Senior Data Engineer / Data Architect

Qode
California City, CA
Onsite

About The Position

We are seeking an experienced Senior Data Engineer / Data Architect with deep expertise in financial services and capital markets trading systems to design, build, and operate highly scalable, low-latency data platforms. This role focuses on creating modern data lakehouse architectures, real-time data pipelines, and analytics-ready data models that support trading, risk, reporting, and regulatory requirements. The ideal candidate brings strong technical depth in distributed data systems, hands-on experience with streaming and batch pipelines, and a solid understanding of the trade lifecycle, market data, and regulatory compliance. You will work closely with trading desks, product teams, AI/analytics teams, and architects to deliver reliable, secure, and high-performance data solutions.

Requirements

  • 7–12+ years of experience in data engineering, backend engineering, or distributed systems
  • Strong programming expertise in Python, Scala, and/or Java
  • Advanced SQL skills
  • Hands-on experience with distributed data processing frameworks such as Apache Spark or Apache Flink
  • Extensive experience with streaming platforms including Kafka, Kinesis, or Pulsar
  • Hands-on experience with data lake and data warehouse technologies such as Databricks, Snowflake, Amazon Redshift, or similar platforms
  • Proven experience building real-time or near real-time data pipelines
  • Strong understanding of data modeling, distributed systems, scalability, and performance optimization

Nice To Haves

  • Experience in Wealth Management or Capital Markets trading systems
  • Familiarity with OMS/EMS platforms such as Charles River Development (CRD), Aladdin, FIS, or similar systems
  • Strong knowledge of market data across equities, fixed income, derivatives, and other asset classes
  • Understanding of the trade lifecycle, including order capture, execution, allocation, clearing, and post-trade processing
  • Experience with cloud-native data platforms on AWS, Azure, or GCP
  • Exposure to real-time analytics, risk management systems, and regulatory reporting platforms

Responsibilities

  • Design, build, and operate highly scalable, low-latency data platforms.
  • Create modern data lakehouse architectures.
  • Develop real-time data pipelines.
  • Build analytics-ready data models that support trading, risk, reporting, and regulatory requirements.
  • Work closely with trading desks, product teams, AI/analytics teams, and architects to deliver reliable, secure, and high-performance data solutions.