Sr Kafka Platform Engineer

American Electric Power
Columbus, OH

About The Position

This role is responsible for engineering and operating the enterprise Kafka platform, which is a strategic, on‑prem data movement backbone. The focus is on platform stability, scalability, data safety, and enablement of producer/consumer teams.

Requirements

  • Strong hands‑on Kafka experience in production environments.
  • Understanding of distributed systems, messaging semantics, and fault tolerance.
  • Experience with Kafka deployments on Kubernetes platforms (OpenShift, Docker, etc.).
  • Experience with multi-region, high-availability on-premises deployments.
  • Experience building producer/consumer applications using Kafka connectors.
  • Experience building custom microservices in Python/Java.
  • Experience with Kafka security (TLS, ACLs, RBAC).
  • Ability to troubleshoot complex, cross‑team data flow issues.
  • Bachelor's degree in computer science, engineering, or a related technical field is required.
  • 8 years of relevant work experience is required.
  • An equivalent combination of education and related experience may be considered.

Nice To Haves

  • Experience with Schema Registry, Kafka Connect, MirrorMaker.
  • Familiarity with enterprise integration platforms and patterns (APIM, Camunda, n8n.io, webMethods, Oracle SOA or similar technologies).
  • Confluent Certified Administrator for Apache Kafka® (CCAAK).
  • Confluent Certified Developer for Apache Kafka® (CCDAK).
  • Confluent Certified Cloud Operator (CCAC).

Responsibilities

  • Develop Kafka producers, consumers, and stream processors using Java Spring Boot, Python, or Node.js.
  • Deploy and manage Kafka connectors (CDC, JDBC, IBM MQ, Redshift, etc.) in Confluent Kafka environments.
  • Containerize and deploy microservices and connectors on OpenShift using Docker, YAML, and Kustomize.
  • Implement CI/CD pipelines using GitHub Actions and Argo CD for automated deployments.
  • Perform basic Kafka administration tasks such as topic creation, partitioning, replication, and monitoring.
  • Collaborate with architects and business teams to understand integration requirements and deliver scalable solutions.
  • Monitor Kafka clusters and connector health using tools like Confluent Control Center, Prometheus, Grafana, Dynatrace, Splunk, and OpenShift-native monitoring.
  • Build and maintain custom connectors for specialized data sources and sinks.
  • Support production deployments and troubleshoot issues across Kafka pipelines and OpenShift environments.
  • Apply a clear understanding of managed cloud services, including AWS and Azure, to deploy Confluent connectors for data integrations.
  • Support real-time data streaming between on-prem systems and cloud/SaaS platforms.
  • Work closely with business teams, architects, and product owners to understand integration requirements.
  • Coordinate with co-sourced/offshore teams for development and deployment tasks.
  • Conduct code reviews, quality checks, and ensure adherence to standards.
  • Ensure reliability and scalability of Kafka pipelines by applying 12-Factor principles and microservices architecture.
  • Collaborate with architects and platform teams to align Kafka solutions with enterprise integration patterns and cloud strategies.
  • Automate deployment, monitoring, and operational tasks wherever possible.
  • Own integration deliverables from design through production deployment.
  • Act as Subject Matter Expert (SME) for Kafka-based integrations and cluster administration.

Benefits

  • Base salary: $116,255.00 - $151,132.50 per year.
  • AEP offers a unique comprehensive benefits package that aims to support and enhance the overall well-being of our employees.