Consolidated Data Analytics Platform Brokering Engineer

Booz Allen Hamilton
Columbus, OH (Remote)

About The Position

The Opportunity: Ever-expanding technology and collection methodologies mean that there is more structured and unstructured data available today than ever before. As a data engineer, you know that big data gathered from disparate sources can yield pivotal insights once it is organized.

We need a Data Engineer experienced in upgrading and maintaining Kafka clusters on Kubernetes in AWS, and in using Kafka Schema Registry and Kafka Security Manager (KSM) to manage schema evolution and security. In this role, you'll apply your expertise in designing, developing, and deploying Kafka clusters in a cloud environment, along with Kafka Schema Registry and Kafka Security Manager. You will guide and mentor data engineers, developers, and data consumers in a fast-paced, agile environment, and oversee the assessment, design, building, and maintenance of scalable platforms for your clients. Join us. The world can't wait.
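To give a flavor of the schema-evolution work mentioned above, here is a deliberately simplified sketch of the backward-compatibility rule that a schema registry enforces for Avro record schemas: consumers on a new schema can still read old data only if every newly added field carries a default. This is an illustrative reduction, not Schema Registry's actual implementation (which also handles type promotion, aliases, and removed fields), and the `Order` schemas are hypothetical.

```python
def added_fields_have_defaults(old_schema: dict, new_schema: dict) -> bool:
    """Simplified backward-compatibility check for Avro record schemas.

    Backward compatibility means consumers using the new schema can read
    data written with the old one; for that, any field added in the new
    schema must carry a default value. A real registry check covers more
    cases (type promotion, aliases, removed fields); this is a sketch.
    """
    old_names = {f["name"] for f in old_schema["fields"]}
    return all(
        "default" in f
        for f in new_schema["fields"]
        if f["name"] not in old_names
    )


# Hypothetical example: v2 adds a field with a default, so old records
# remain readable under the new schema.
v1 = {
    "type": "record",
    "name": "Order",
    "fields": [{"name": "id", "type": "string"}],
}
v2 = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "priority", "type": "int", "default": 0},
    ],
}
```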

Requirements

  • Experience with Kafka or Confluent in a containerized environment
  • Experience with Apache NiFi in a containerized environment
  • Experience creating data partitioning strategies and monitoring topics for performance
  • Experience deploying and upgrading Kafka clusters in high availability containerized environments
  • Experience utilizing observability platforms, including Prometheus, Grafana, or Elastic, to configure monitoring and alerting for data pipelines, ensuring high availability, high throughput, and low latency
  • Knowledge of stream processing pipelines and analytics
  • Secret clearance
  • HS diploma or GED
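One element of the partitioning-strategy experience listed above is how record keys map to partitions. The sketch below uses a CRC32 hash modulo the partition count purely for illustration; Kafka's actual default partitioner uses a murmur2 hash, but the principle (a stable hash of the key, so that each key always lands on the same partition and per-key ordering is preserved) is the same.

```python
import zlib


def assign_partition(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Illustrative only: Kafka's default partitioner uses murmur2,
    not CRC32, but the idea is identical -- a stable hash of the
    key taken modulo the partition count.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions


# Records with the same key always land on the same partition,
# which is what preserves per-key ordering within a topic.
keys = ["customer-17", "customer-42", "customer-17"]
partitions = [assign_partition(k, 6) for k in keys]
```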

Nice To Haves

  • Experience with the Confluent for Kubernetes (CFK) Operator
  • Experience with scripting in Bash or Python
  • Experience deploying and maintaining applications, appliances, or machines aligned to DoD Security Technical Implementation Guidelines (STIGs) and Security Requirements Guides (SRGs)
  • Experience writing playbooks and scripts for automation tools, including Terraform, Ansible, or Puppet, for Infrastructure-as-Code (IaC) and Configuration-as-Code (CaC)
  • Experience in DoW, Intelligence Community, or other regulated environments
  • Knowledge of cybersecurity concepts, including threats, vulnerabilities, security operations, encryption, boundary defense, auditing, authentication, and supply chain risk management
  • Knowledge of Zero Trust Architecture (ZTA) principles

Responsibilities

  • Designing, developing, and deploying Kafka clusters in a cloud environment, along with Kafka Schema Registry and Kafka Security Manager
  • Guiding and mentoring data engineers, developers, and data consumers in a fast-paced, agile environment
  • Overseeing the assessment, design, building, and maintenance of scalable platforms for clients

Benefits

  • Health, life, disability, financial, and retirement benefits
  • Paid leave
  • Professional development
  • Tuition assistance
  • Work-life programs
  • Dependent care
  • Recognition awards program