Senior Confluent Kafka Lead

San R&D Business Solutions LLC
Worthington, OH
Onsite

About The Position

The Confluent Kafka Lead / Python Developer is responsible for designing, building, and operating enterprise-grade event streaming solutions using Confluent Kafka while developing Python-based producers, consumers, and streaming applications. This role blends deep hands-on development with technical leadership, ensuring scalable, reliable, and secure real-time data flows across distributed systems. The position plays a key role in event-driven architecture (EDA), data platform modernization, and real-time analytics initiatives.

Requirements

  • 6–10+ years of experience in software or data engineering.
  • 4+ years of hands-on experience with Apache Kafka and/or Confluent Platform.
  • Strong knowledge of Kafka internals (brokers, partitions, offsets, consumer groups).
  • Strong knowledge of Schema Registry and schema evolution.
  • Strong knowledge of Kafka Connect architectures and connectors.
  • Strong proficiency in Python for backend and streaming development.
  • Experience building production-grade services using Python frameworks and libraries.
  • Experience deploying Kafka and applications in cloud or hybrid environments (AWS, Azure, GCP).
  • CI/CD pipeline experience (GitHub Actions, Jenkins, GitLab, Azure DevOps).
  • Infrastructure-as-Code experience (Terraform, CloudFormation, ARM/Bicep).

Nice To Haves

  • Familiarity with async processing, multithreading, or stream processing patterns.
  • Containerization experience (Docker, Kubernetes).
  • Experience with ksqlDB, Kafka Streams, or stream processing frameworks (Flink, Spark Streaming).
  • Exposure to event sourcing or CQRS patterns.
  • Integration of Kafka with data lakes, warehouses, and analytics platforms.
  • Confluent or cloud platform certifications.
  • Experience supporting high-throughput, low-latency systems.
  • Strong communication skills across engineering and stakeholder teams.
  • Ability to translate business use cases into event-driven technical solutions.
  • Comfortable acting as both hands-on developer and technical lead.
  • Experience influencing architecture and standards across teams.

Responsibilities

  • Lead the design and implementation of enterprise Kafka and Confluent Platform solutions (Kafka, Schema Registry, Connect, ksqlDB).
  • Define and enforce topic design, partitioning, retention, and schema evolution standards.
  • Act as technical owner for Kafka clusters across dev, test, and production environments.
  • Drive best practices for high availability, fault tolerance, and scalability.
  • Design and develop Python-based Kafka producers and consumers using Confluent Kafka Python APIs.
  • Build event-driven microservices and streaming applications in Python.
  • Implement message serialization and schema validation (Avro, JSON, Protobuf).
  • Handle idempotency, retries, back-pressure, and error handling patterns.
  • Design event-driven integration patterns bridging microservices, data stores, APIs, and third-party systems.
  • Integrate Kafka with downstream consumers such as databases, data lakes, analytics platforms, and search systems.
  • Support real-time pipelines for transactions, telemetry, customer events, and analytics.
  • Collaborate with API, data, and application teams to align event contracts.
  • Implement Kafka security controls: TLS encryption, SASL/OAuth authentication, ACL-based authorization.
  • Enforce data governance, schema compatibility rules, and event ownership models.
  • Ensure compliance with enterprise security and regulatory standards.
  • Build and maintain CI/CD pipelines for Kafka-related applications and configurations.
  • Use Infrastructure as Code to provision and manage Kafka infrastructure.
  • Implement monitoring and alerting using tools such as Confluent Control Center, Prometheus, Grafana, or cloud-native equivalents.
  • Troubleshoot production streaming issues related to latency, lag, throughput, or data loss.
  • Serve as Kafka subject matter expert and technical lead.
  • Mentor developers on event-driven design and streaming best practices.
  • Review designs and code for Kafka and Python-based streaming solutions.
  • Partner with architects, SREs, and platform teams on roadmap and capacity planning.
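As a rough illustration of the producer configuration, idempotency, and retry/error-handling responsibilities above, the sketch below shows the style of work involved. The broker address, topic, and consumer group names are illustrative assumptions; the config keys mirror the confluent-kafka Python client's Producer/Consumer settings, and the retry helper is a generic pattern, not a prescribed implementation.

```python
import json
import time

# Hypothetical broker address and topic, for illustration only.
BROKER = "localhost:9092"
TOPIC = "customer-events"

# Idempotent-producer settings for confluent-kafka's Producer:
# enable.idempotence prevents duplicate writes on retry, and
# acks=all waits for all in-sync replicas before acknowledging.
producer_config = {
    "bootstrap.servers": BROKER,
    "enable.idempotence": True,
    "acks": "all",
    "retries": 5,
}

# Consumer settings: manual offset commits so offsets advance only
# after a message is fully processed (at-least-once delivery).
consumer_config = {
    "bootstrap.servers": BROKER,
    "group.id": "analytics-consumers",
    "enable.auto.commit": False,
    "auto.offset.reset": "earliest",
}

def process_with_retry(handler, message, max_attempts=3, base_delay=0.1):
    """Retry a message handler with exponential backoff; re-raise after
    max_attempts so the caller can route to a dead-letter topic."""
    for attempt in range(1, max_attempts + 1):
        try:
            return handler(message)
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a flaky handler that fails once, then succeeds.
calls = {"n": 0}
def flaky_handler(msg):
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return json.loads(msg)

result = process_with_retry(flaky_handler, '{"event": "signup"}')
print(result)  # → {'event': 'signup'}
```

In a real service, `process_with_retry` would wrap the consumer's per-message handler, with the offset committed only after the handler returns and unrecoverable messages forwarded to a dead-letter topic.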