CACI International · Posted 3 months ago
$103,800 - $218,100/Yr
Full-time • Mid Level
5,001-10,000 employees

CACI is seeking a Kafka Engineer to join our team and support the Border Enforcement Applications for Government Leading-Edge Information Technology (BEAGLE) contract. You will have the opportunity to apply your knowledge, skills, and experience to build a truly modern, cloud-native application under new development. If you thrive in a culture of innovation and bring creative ideas to solve complex technical and procedural problems at the team and portfolio levels, then this opportunity is for you! Join this passionate team of industry-leading individuals supporting best practices in agile software development for the Department of Homeland Security (DHS). You will support the men and women charged with safeguarding the American people and enhancing the nation's safety and security.

Responsibilities:

  • Serve as an Agile Scrum team member providing software development support and maintenance for the delivery of releasable software in short sprint cycles.
  • Deliver software solutions for customer-defined systems and projects, working in close collaboration with software developers/engineers, stakeholders, and end users within Agile processes.
  • Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java.
  • Collaborate with architects and other engineering teams to define and evolve our event-driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention.
  • Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features.
  • Monitor, troubleshoot, and optimize Kafka clusters and Kafka-dependent applications for throughput, latency, reliability, and resource utilization.
  • Build and maintain robust and resilient data pipelines for real-time ingestion, transformation, and distribution of data across various systems.
  • Provide operational support for Kafka-based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability.
  • Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services.
  • Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end-to-end tests, ensuring data integrity and system reliability.
  • Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka-related components and processes.
  • Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization.
Required Qualifications:

  • Must be a U.S. Citizen with the ability to pass a CBP background investigation.
  • Extensive hands-on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups).
  • Deep understanding of Kafka's internal architecture, delivery guarantees (at-least-once, exactly-once), and offset management.
  • Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka).
  • High-level proficiency in at least one modern backend programming language suitable for Kafka development (Java strongly preferred).
  • Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns.
  • Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry.
  • Solid understanding of relational and/or NoSQL databases, and experience integrating them with Kafka.
  • Excellent analytical, debugging, and problem-solving skills in complex distributed environments.
  • Strong verbal and written communication skills, with the ability to clearly articulate technical concepts to diverse audiences.
  • Knowledge of monitoring and observability tools for Kafka and streaming applications (e.g., Prometheus, Grafana, ELK stack, Datadog).
  • Working knowledge of Git and collaborative development workflows.
  • Understanding of all elements of the software development life cycle, including planning, development, requirements management, configuration management (CM), quality assurance, and release management.
  • At least seven (7) years of related technical experience, including software design, development, and implementation in a Windows environment.
  • College degree (B.S.) in Computer Science, Software Engineering, Information Management Systems, or a related discipline. Equivalent professional experience will be considered in lieu of a degree.
Desired Qualifications:

  • Hands-on experience with Confluent Platform components (Control Center, ksqlDB, REST Proxy, Tiered Storage).
  • Experience with Kafka Connect for building data integration pipelines (developing custom connectors is a plus).
  • Familiarity with cloud platforms (AWS, Azure, GCP) and managed Kafka services (e.g., AWS MSK, Confluent Cloud, Azure Event Hubs).
  • Experience with containerization (Docker) and orchestration (Kubernetes) for deploying Kafka-dependent applications.
  • Experience with CI/CD pipelines for automated testing and deployment of Kafka-based services.
  • Familiarity with performance testing and benchmarking tools for Kafka and related applications.
Benefits:

  • Healthcare
  • Wellness programs
  • Financial benefits
  • Retirement plans
  • Family support
  • Continuing education
  • Time off benefits