Sr Kafka Platform Engineer

American Electric Power
Columbus, OH
Onsite

About The Position

At AEP, we’re more than just an energy company! We’re a team of dedicated professionals committed to delivering safe, reliable, and innovative energy solutions. Guided by our mission to put the customer first, we strive to exceed expectations by listening, responding, and continuously improving the way we serve our communities. If you're passionate about making a meaningful impact and being part of a forward-thinking organization, this is the company for you!

This role is responsible for engineering and operating the enterprise Kafka platform, a strategic, on‑prem data movement backbone. The focus is on platform stability, scalability, data safety, and enablement of producer/consumer teams.

Requirements

  • Strong hands‑on Kafka experience in production environments.
  • Understanding of distributed systems, messaging semantics, and fault tolerance.
  • Experience with Kafka deployments on Kubernetes platforms (OpenShift, Docker, etc.).
  • Experience with multi-region, high-availability on-premises deployments.
  • Experience building producer/consumer applications using Kafka connectors.
  • Experience building custom microservices in Python/Java
  • Experience with Kafka security (TLS, ACLs, RBAC).
  • Ability to troubleshoot complex, cross‑team data flow issues.
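As a rough illustration of the TLS and ACL work this role covers, a broker might be secured with settings along these lines. This is a sketch, not AEP's actual configuration; hostnames, paths, passwords, and principal names are placeholders.

```properties
# Illustrative broker settings for a TLS-only listener with ACL authorization.
# All values below are placeholders for a real deployment.
listeners=SSL://kafka-broker-1.example.internal:9093
security.inter.broker.protocol=SSL
ssl.keystore.location=/etc/kafka/secrets/broker.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/secrets/broker.truststore.jks
ssl.truststore.password=changeit
# Require client certificates (mutual TLS).
ssl.client.auth=required
# Enforce ACLs; deny access to topics with no ACL defined.
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
super.users=User:CN=kafka-admin
```

Note that `AclAuthorizer` applies to ZooKeeper-based clusters; KRaft-mode clusters would use `StandardAuthorizer` instead.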

Nice To Haves

  • Experience with Schema Registry, Kafka Connect, and MirrorMaker.
  • Familiarity with enterprise integration platforms and patterns (APIM, Camunda, n8n.io, webMethods, Oracle SOA or similar technologies).

Responsibilities

  • Develop Kafka producers, consumers, and stream processors using Java Spring Boot, Python, or Node.js.
  • Deploy and manage Kafka connectors (CDC, JDBC, IBM MQ, Redshift, etc.) in Confluent Kafka environments.
  • Containerize and deploy microservices and connectors on OpenShift using Docker, YAML, and Kustomize.
  • Implement CI/CD pipelines using GitHub Actions and Argo CD for automated deployments.
  • Perform basic Kafka administration tasks such as topic creation, partitioning, replication, and monitoring.
  • Collaborate with architects, product owners, and business teams to understand integration requirements and deliver scalable solutions.
  • Monitor Kafka clusters and connector health using tools like Confluent Control Center, Prometheus, Grafana, Dynatrace, Splunk, and OpenShift-native monitoring.
  • Build and maintain custom connectors for specialized data sources and sinks.
  • Support production deployments and troubleshoot issues across Kafka pipelines and OpenShift environments.
  • Apply a clear understanding of managed cloud services (AWS, Azure) when deploying Confluent connectors for data integrations.
  • Support real-time data streaming between on-prem systems and cloud/SaaS platforms.
  • Coordinate with co-sourced/offshore teams for development and deployment tasks.
  • Conduct code reviews, quality checks, and ensure adherence to standards.
  • Ensure reliability and scalability of Kafka pipelines by applying 12-Factor principles and microservices architecture.
  • Collaborate with architects and platform teams to align Kafka solutions with enterprise integration patterns and cloud strategies.
  • Automate deployment, monitoring, and operational tasks wherever possible.
  • Own integration deliverables from design through production deployment.
  • Act as Subject Matter Expert (SME) for Kafka-based integrations and cluster administration.
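To give a concrete sense of the connector work described above, a JDBC source connector submitted to the Kafka Connect REST API might look like the following sketch. The connector name, connection URL, columns, and secret path are hypothetical; the property keys are standard Confluent JDBC Source Connector settings.

```json
{
  "name": "jdbc-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@db.example.internal:1521/ORDERS",
    "connection.user": "kafka_connect",
    "connection.password": "${file:/secrets/db.properties:password}",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "order_id",
    "topic.prefix": "orders.",
    "tasks.max": "2"
  }
}
```

The `timestamp+incrementing` mode combines an update-timestamp column with a monotonically increasing key column so that both new rows and in-place updates are captured reliably.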

Benefits

  • Competitive compensation
  • A comprehensive benefits package designed to support and enhance the overall well-being of our employees