About The Position

Altamira Technologies Corporation has a long and successful history of providing innovative solutions throughout the U.S. National Security community. Headquartered in McLean, Virginia, Altamira serves the defense, intelligence, and homeland security communities by creating solutions that leverage common standards in architecture, data, and security. Altamira believes that our people and the culture of our company differentiate us from other firms, and we focus on recruiting talented, self-motivated employees who strive to find a way to get things done.

Altamira is seeking a Data Engineer to design, build, and operate high-performance data pipelines and event-driven systems supporting mission-critical platforms. This role focuses on implementing and managing Apache Kafka-based messaging architectures and integrating real-time data streams with cloud-native applications and analytics platforms. The ideal candidate brings strong experience with distributed systems, data streaming technologies, and cloud environments, and is comfortable working in secure, high-reliability settings.

Requirements

  • Active TS/SCI clearance
  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience)
  • Experience in data engineering, distributed systems, or backend engineering roles
  • Hands-on experience with Apache Kafka in production environments
  • Experience building and supporting real-time data pipelines
  • Strong proficiency in Java, Python, Scala, or similar programming languages
  • Experience working in AWS or hybrid cloud environments
  • Strong Linux systems administration and troubleshooting skills
  • Ability to work effectively in secure, mission-focused environments

Nice To Haves

  • Experience with Kafka Connect, Kafka Streams, or similar frameworks
  • Experience with stream processing platforms (Flink, Spark Streaming, etc.)
  • Experience with PostgreSQL, Redis, ArangoDB, or other data platforms
  • Experience with object storage systems such as MinIO or S3
  • Familiarity with Kubernetes-based deployments
  • Experience implementing data security and compliance controls
  • Prior experience supporting DoD or Intelligence Community programs

Responsibilities

  • Design, deploy, and operate Apache Kafka clusters in classified and hybrid environments
  • Build and maintain reliable, scalable, and secure data streaming pipelines
  • Develop and optimize producers, consumers, and stream processing applications
  • Configure and manage topics, partitions, replication, and retention policies
  • Monitor, tune, and troubleshoot Kafka performance, availability, and latency
  • Integrate streaming platforms with databases, storage systems, and analytics tools
  • Implement data governance, retention, and access control policies
  • Automate deployment and management of streaming infrastructure
  • Collaborate with platform, infrastructure, and application teams to support data requirements
  • Support system accreditation, compliance, and security requirements
  • Participate in architecture design and technical planning activities