Booz Allen Hamilton · Posted 1 day ago
$62,000 - $141,000/Yr
Full-time • Mid Level
USA, VA

Data Engineer, Mid

The Opportunity

Do you want to work at the forefront of advanced technology and solve complex data challenges? You know that data yields pivotal insights when it's gathered from disparate sources and organized. As a data engineer, you have the chance to develop and deploy the pipelines and platforms that make this data meaningful. What's more, you'll have the chance to grow Booz Allen's DataOps capabilities while working with a multi-disciplinary team of analysts, data engineers, data scientists, developers, and data consumers in a fast-paced, agile environment. We're looking for someone like you to ensure our clients meet their mission by impacting a results-driven national security organization. This is an opportunity to support data engineering activities on some of the most mission-driven projects in the industry. Driving innovation, you'll architect data systems, stand up data platforms, build out ETL pipelines, write custom code, interface with data stores, perform data ingestion, and build data models. You'll also perform analytical exploration of data, lead the assessment, design, building, and maintenance of scalable platforms, and guide clients to solve their most pressing challenges. Ready to drive innovation using cutting-edge data tools and techniques? Join us. The world can't wait.

What you'll do:
  • Architect data systems
  • Stand up data platforms
  • Build out ETL pipelines
  • Write custom code
  • Interface with data stores
  • Perform data ingestion
  • Build data models
  • Perform analytical exploration and examination of data
  • Lead the assessment, design, building, and maintenance of scalable platforms
  • Guide your clients to solve their most pressing challenges

You have:
  • 3+ years of experience utilizing programming languages, including Python, SQL, and PySpark
  • 2+ years of experience with building, maintaining, and developing production data pipeline tools including Apache Airflow or Luigi
  • 2+ years of experience developing and maintaining scalable data stores that supply big data in forms needed for business analysis
  • Experience creating software for retrieving, parsing, and processing structured and unstructured data
  • Experience with developing scalable ETL / ELT workflows for reporting and analytics
  • Experience with secure data transfer protocols and working within and across government network environments, including NIPR, SIPR, or JWICS
  • Experience creating solutions within a collaborative, cross-functional team environment
  • Ability to develop scripts and programs for converting various types of data into usable formats, and to support the project team in scaling, monitoring, and operating data platforms
  • TS/SCI clearance
  • Bachelor’s degree

Nice if you have:
  • Experience with distributed data and computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka
  • Experience with geospatial data formats, including Shapefile, GeoJSON, or KML
  • Experience with geospatial processing libraries, including GDAL, Geopandas, or PostGIS
  • Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud
  • Experience with version control, including Git and collaborative development workflows
  • Experience working on real-time data and streaming applications
  • Experience with NoSQL implementation using MongoDB or Cassandra
  • Experience with data warehousing, including AWS Redshift, MySQL, or Snowflake
  • Experience with Agile engineering practices
  • TS/SCI clearance with a polygraph

Benefits include:
  • health
  • life
  • disability
  • financial
  • retirement benefits
  • paid leave
  • professional development
  • tuition assistance
  • work-life programs
  • dependent care
  • recognition awards program
© 2024 Teal Labs, Inc