Senior Event Streaming Engineer

BDIPlus - New York, NY

About The Position

BDIPlus is seeking a Senior Event Streaming Engineer to support a Fortune 100 financial services client's enterprise data and AI transformation. You will help build an event-driven intelligence platform, working at the intersection of real-time data engineering, streaming architecture, and AI enablement. You'll design scalable pipelines that convert raw system activity into high-value, real-time insights consumed by analytics platforms and AI agents, directly shaping how the business understands and engages with its customers.

Requirements

  • 5+ years of hands-on experience with Apache Kafka (topic design, partitioning, scaling, consumer group management)
  • Strong expertise in Kafka Streams (stateful processing, windowing, exactly-once semantics; see the sketch after this list)
  • Experience with Schema Registry (Confluent or AWS Glue) and schema evolution (Avro/Protobuf)
  • Proven experience with Kafka Connect and S3 Sink Connector; familiarity with Lakehouse patterns (Apache Iceberg preferred)
  • Experience with monitoring and observability (JMX, consumer lag analysis, ELK dashboards)
  • Hands-on experience with CDC tools (e.g., Debezium, Informatica IDMC CDC, or equivalent)
  • Strong programming skills in SQL and at least one of: Python, Java, or Scala
  • Familiarity with AWS services (S3, Glue, Athena, Step Functions)
  • Comfort working with AI-assisted development tools (Claude Code experience preferred; training available)
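To make the Kafka Streams expectations concrete, below is a minimal Java sketch of a stateful, windowed aggregation with exactly-once processing enabled. The topic names, store name, and 7-day window are illustrative assumptions rather than the client's actual design; 30- and 90-day aggregates would follow the same pattern.

    import java.time.Duration;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.utils.Bytes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.state.WindowStore;

    public class LifecycleAggregates {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "lifecycle-aggregates");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // Exactly-once processing, as the role description calls for.
            props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);

            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("customer-events",                        // hypothetical input topic
                           Consumed.with(Serdes.String(), Serdes.String()))
                   .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                   // 7-day tumbling window; the named state store is fault-tolerant
                   // via its backing changelog topic.
                   .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofDays(7)))
                   .count(Materialized.<String, Long, WindowStore<Bytes, byte[]>>as("events-7d"))
                   .toStream()
                   // Re-key from the windowed key back to the customer id for downstream consumers.
                   .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count))
                   .to("customer-aggregates-7d",                     // hypothetical output topic
                       Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }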

Nice To Haves

  • Experience working with mainframe data (COBOL copybooks, VSAM, AIX/AS400)
  • Background in financial services or insurance
  • Hands-on experience with Apache Iceberg (partitioning, schema evolution)
  • Familiarity with Denodo or semantic data layer tools
  • Experience with infrastructure-as-code (Terraform or CloudFormation)
  • Consulting or client-facing experience, including knowledge transfer responsibilities

Responsibilities

  • Instrument administration systems to capture and emit key customer lifecycle events into Apache Kafka
  • Design and build Kafka-to-Iceberg data pipelines using S3 Sink Connector with exactly-once delivery guarantees
  • Develop real-time behavioral aggregates (7/30/90-day windows) using Kafka Streams with stateful processing and windowed joins
  • Configure and manage Schema Registry (Confluent or AWS Glue), enforcing BACKWARD_TRANSITIVE compatibility (see the compatibility sketch after this list)
  • Implement resilient DLQ (Dead Letter Queue) patterns with ordering-safe replay and escalation workflows (see the DLQ sketch after this list)
  • Build and maintain observability frameworks (ELK, Prometheus) for pipeline health, consumer lag, and alerting
  • Partner with AI/Agent engineering teams to deliver production-ready streaming data via MCP endpoints
  • Develop CDC pipelines for legacy and modern systems (mainframe, AIX/AS400, and policy admin platforms)
  • Participate in client demos and contribute to knowledge transfer with internal client teams
  • Leverage AI-assisted development tools (Claude Code / Codex) to accelerate delivery
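For the compatibility-enforcement responsibility above, here is a minimal Java sketch using Confluent Schema Registry's REST endpoint (PUT /config/{subject}); the registry URL and subject name are placeholder assumptions:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class EnforceCompatibility {
        public static void main(String[] args) throws Exception {
            String registry = "http://schema-registry:8081";   // hypothetical registry URL
            String subject  = "customer-events-value";         // hypothetical subject

            // PUT /config/{subject} sets the per-subject compatibility level.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(registry + "/config/" + subject))
                    .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                    .PUT(HttpRequest.BodyPublishers.ofString(
                            "{\"compatibility\": \"BACKWARD_TRANSITIVE\"}"))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // A 200 response echoing the compatibility level indicates success.
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

BACKWARD_TRANSITIVE checks each new schema version against all earlier versions of the subject, not just the latest, which is what makes it safe for long-lived consumers replaying historical data.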
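And for the DLQ responsibility, a minimal Java sketch of an ordering-safe dead-letter publish: the failed record is forwarded under its original key, so replayed records hash to the same partition and per-key ordering is preserved, with error context carried in headers. The topic naming convention and header names are illustrative assumptions:

    import java.nio.charset.StandardCharsets;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class DeadLetterPublisher {
        private final KafkaProducer<byte[], byte[]> producer;

        // producerProps must configure ByteArraySerializer for keys and values.
        public DeadLetterPublisher(Properties producerProps) {
            this.producer = new KafkaProducer<>(producerProps);
        }

        public void toDlq(ConsumerRecord<byte[], byte[]> failed, Exception cause) {
            ProducerRecord<byte[], byte[]> dlqRecord = new ProducerRecord<>(
                    failed.topic() + ".dlq",    // hypothetical DLQ naming convention
                    failed.key(),               // original key: same partition on replay
                    failed.value());
            // Error context travels in headers so the payload stays byte-identical.
            dlqRecord.headers()
                    .add("dlq.error", cause.toString().getBytes(StandardCharsets.UTF_8))
                    .add("dlq.source.topic", failed.topic().getBytes(StandardCharsets.UTF_8))
                    .add("dlq.source.offset",
                            Long.toString(failed.offset()).getBytes(StandardCharsets.UTF_8));
            producer.send(dlqRecord);
        }
    }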