Booz Allen Hamilton • Posted 1 day ago
Full-time • Mid Level
Hybrid • Reston, VA
1-10 employees

Data Engineer

The Opportunity: As a Data Engineer, you will support the development and maintenance of scalable data stores that supply big data in the forms needed for data analysis. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support our data-driven initiatives, including developing robust Extract, Transform, and Load (ETL) processes to integrate data from various sources into our data ecosystem, and designing and maintaining data models, schemas, and database structures to support analytical and operational use cases. You will collaborate closely with cross-functional teams to ensure the availability, reliability, and performance of our data systems and solutions. In this role, you’ll make a mission-forward impact as you sharpen your skill set and grow your career. Work with us as we shape systems for the better. Join us. The world can’t wait.

Key Responsibilities:
  • Support the development and maintenance of scalable data stores that supply big data in the forms needed for data analysis
  • Design, build, and maintain scalable data pipelines and infrastructure to support our data-driven initiatives
  • Develop robust Extract, Transform, and Load (ETL) processes to integrate data from various sources into our data ecosystem
  • Design and maintain data models, schemas, and database structures to support analytical and operational use cases
  • Collaborate closely with cross-functional teams to ensure the availability, reliability, and performance of our data systems and solutions
Required Qualifications:
  • 3+ years of experience with the entire ETL pipeline, including data acquisition, data prep, and database architecture
  • 2+ years of experience using Java to build enterprise products and applications, including with Spring Boot
  • 2+ years of experience with Python and SQL
  • 2+ years of experience using Docker or Kubernetes for containerized environments
  • 2+ years of experience working with big data technologies and distributed processing, including Spark
  • Experience building real-time data processing applications
  • Experience with Git and GitLab CI/CD
  • TS/SCI clearance
  • Bachelor's degree
  • Ability to obtain a Security+ Certification within 6 months of hire date
Preferred Qualifications:
  • Experience with Metadata Management, such as Data Catalogs or Advana
  • Experience with Airflow pipeline orchestration
  • Experience with AI or ML pipelines and processes
  • Experience with back-end frameworks such as Spring Boot
  • Experience building real-time data processing applications using Kafka Streams
  • Excellent communication skills
Benefits:
  • Health, life, disability, financial, and retirement benefits, as well as paid leave, professional development, tuition assistance, work-life programs, and dependent care
  • A recognition awards program that acknowledges employees for exceptional performance and superior demonstration of our values