Booz Allen Hamilton
Posted 2 months ago
$62,000 - $141,000/Yr
Mid Level
Alexandria, VA
5,001-10,000 employees
Professional, Scientific, and Technical Services

Ever-expanding technologies like IoT, machine learning, and artificial intelligence mean there's more structured and unstructured data available today than ever before. As a data engineer, you know that organizing data gathered from disparate sources can yield pivotal insights. We need a data professional like you to help our clients find answers in their data to impact important missions, from fraud detection to cancer research to national intelligence.

As a data engineer at Booz Allen, you'll use your skills and experience to help build advanced technology solutions and implement data engineering activities on some of the most mission-driven projects in the industry. You'll develop and deploy the pipelines and platforms that organize disparate data and make it meaningful. Here, you'll work with a multi-disciplinary team of analysts, data engineers, developers, and data consumers in a fast-paced, agile environment. You'll sharpen your skills in analytical exploration and data examination while you support the assessment, design, development, and maintenance of scalable platforms for your clients. Work with us to use data for good.

  • Design, develop, and maintain scalable data pipelines for ingestion, processing, and distribution of large datasets
  • Implement data engineering best practices including data quality monitoring, testing, and documentation
  • Manage secure data transfers across government networks while ensuring compliance with security protocols
  • Process, analyze, and optimize geospatial datasets for various analytical and operational use cases
  • Collaborate with data scientists, data visualization experts, and subject matter experts to understand data requirements and deliver solutions
  • Monitor pipeline performance, troubleshoot issues, and implement optimizations
  • Maintain data governance standards and ensure data lineage documentation
  • Support data architecture decisions and contribute to technical design reviews
  • 2+ years of experience in Python and SQL
  • Experience with PySpark, Java, Scala, or Go
  • Experience building and maintaining production data pipelines to support high-volume, real-time processing
  • Experience with secure data transfer protocols and working within and across government network environments, such as NIPR, SIPR, or JWICS
  • Knowledge of ETL/ELT processes, data modeling, data warehousing concepts, and data quality frameworks
  • TS/SCI clearance
  • Bachelor's degree in a Science, Technology, Engineering, or Mathematics (STEM) field
  • Experience working in the Databricks environment, including coding and managing workflows
  • Experience with geospatial data formats (Shapefile, GeoJSON, KML), coordinate systems, and geospatial processing libraries such as GDAL, GeoPandas, and PostGIS
  • Experience with Git and collaborative development workflows
  • Experience with AWS, Azure, or GCP data services, such as S3, RDS, or BigQuery
  • Experience with Docker and Kubernetes for pipeline deployment
  • Experience working in Agile or Scrum development environments
  • Experience with streaming data processing and real-time analytics
  • Experience with data visualization tools such as Qlik, Tableau, Power BI, or open-source alternatives
  • Knowledge of infrastructure-as-code tools such as Terraform or CloudFormation
  • Knowledge of ML pipeline development and MLOps practices
  • Health benefits
  • Life insurance
  • Disability insurance
  • Financial benefits
  • Retirement benefits
  • Paid leave
  • Professional development
  • Tuition assistance
  • Work-life programs
  • Dependent care
  • Recognition awards program