Data Engineer - Expert (TS/SCI with Poly REQUIRED)

General Dynamics Information Technology
Chantilly, VA
Onsite

About The Position

We are seeking an expert Data Engineer to design, build, and optimize scalable backend systems and data integration solutions. This role requires deep expertise across the full software development lifecycle, from system architecture through deployment, with a focus on data-intensive applications, performance optimization, and cross-platform integration. The ideal candidate thrives in dynamic environments and delivers high-impact solutions supporting mission-critical operations.

Requirements

  • Active TS/SCI clearance with Polygraph (required)
  • Strong proficiency in one or more: Java, Python, Scala, Go, or JavaScript frameworks
  • Hands-on experience with Kubernetes and Airflow
  • Experience with relational and NoSQL databases, as well as cloud-native data services
  • Familiarity with CI/CD pipelines and Docker
  • Demonstrated ability to work across the full software development lifecycle
  • Strong analytical, troubleshooting, and communication skills
  • Bachelor’s degree in Computer Science or a related field (or equivalent combination of education, certifications, and experience)
  • 15+ years of relevant professional experience
  • US Citizenship Required

Nice To Haves

  • Experience building data products using Apache Avro
  • Experience with Apache ActiveMQ

Responsibilities

  • Design, develop, and maintain scalable backend systems using languages such as Java, Python, Go, or JavaScript frameworks (e.g., Angular)
  • Architect and implement data integration solutions for both structured and unstructured data
  • Develop and maintain RESTful APIs for real-time and batch processing
  • Build modular, reusable software libraries with clean, well-defined interfaces
  • Design, build, and optimize ETL pipelines using tools such as Elasticsearch, Kafka, and Airflow
  • Ensure high performance, scalability, and reliability of data workflows
  • Integrate externally developed code into core systems and libraries
  • Deploy and maintain applications in cloud environments (AWS, Azure, GCP)
  • Implement containerization and orchestration using Docker and Kubernetes
  • Develop and maintain CI/CD pipelines to support automated delivery
  • Follow configuration management and infrastructure best practices
  • Translate business requirements into technical solutions across data and application layers
  • Maintain high code quality through testing, code reviews, and regression analysis
  • Troubleshoot and resolve production issues to improve system stability and data integrity
  • Conduct full lifecycle testing in accordance with established quality standards
  • Provide bug fixes, enhancements, UI improvements, and access control updates
  • Install, configure, and monitor applications to meet operational needs
  • Deliver user support and ad hoc training as required
  • Collaborate with configuration managers to integrate internal and external solutions
  • Evaluate and recommend third-party tools, vendors, and technologies
  • Contribute to development standards, methodologies, and release readiness processes

Benefits

  • Our benefits package for all US-based employees includes a variety of medical plan options, some with Health Savings Accounts, dental plan options, a vision plan, and a 401(k) plan offering the ability to contribute both pre- and post-tax dollars up to the IRS annual limits and receive a company match.
  • To encourage work/life balance, GDIT offers employees full flex work weeks where possible and a variety of paid time off plans, including vacation, sick and personal time, holidays, paid parental, military, bereavement and jury duty leave.
  • To ensure our employees are able to protect their income, other offerings such as short and long-term disability benefits, life, accidental death and dismemberment, personal accident, critical illness and business travel and accident insurance are provided or available.