Data Engineer

Booz Allen Hamilton, Fayetteville, NC
Remote

About The Position

Rapid advances in IoT, machine learning, and artificial intelligence mean organizations have access to more data than ever before—both structured and unstructured. Turning that data into actionable insight requires strong engineering, thoughtful design, and scalable platforms. As a senior Data Engineer at Booz Allen, you will design and build data platforms and pipelines that power mission-critical outcomes. You will work on complex, high-impact programs that help clients solve real-world challenges with data. You’ll collaborate with analysts, AI engineers, developers, and stakeholders in a fast-paced, agile environment. In this role, you will lead technical efforts across assessment, design, development, and sustainment of scalable data platforms while mentoring teammates and driving best practices.

Requirements

  • 2+ years of experience as a data engineer supporting large-scale enterprise systems
  • Experience writing clean, secure, and efficient Python for data engineering use cases
  • Experience moving data from on-prem environments to the cloud and automating development workflows, including building, deploying, or managing applications
  • Experience with CI/CD practices
  • Experience with SQL, including stored procedures and data modeling
  • Experience with data governance and data tagging
  • Knowledge of AWS services, such as S3, IAM, EventBridge, Step Functions, or Lambda
  • Ability to work independently and manage tasks with minimal supervision
  • TS/SCI clearance
  • HS diploma or GED
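To make the Python-plus-AWS requirements above concrete, here is a minimal sketch of a Lambda-style pipeline step that validates incoming records and applies a simple governance tag. The event shape, field names, and `"internal"` classification label are all illustrative assumptions, not details from this posting.

```python
import json

def handler(event, context=None):
    """Hypothetical Lambda-style handler: validate records and tag them."""
    tagged = []
    for record in event.get("records", []):
        # Basic data-quality gate: skip records missing the required key.
        if "id" not in record:
            continue
        # Example data-governance tag attached during ingestion.
        record["classification"] = "internal"
        tagged.append(record)
    return {"count": len(tagged), "body": json.dumps(tagged)}
```

Because the handler is a plain function, logic like this can be unit-tested locally before it is wired to an AWS trigger such as EventBridge or Step Functions.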

Nice To Haves

  • Experience with Palantir Foundry's application development ecosystem, including Pipeline Builder and AIP
  • Experience with ETL tools, such as dbt and Airflow
  • Experience with NoSQL and graph databases
  • Experience designing and building data warehouses
  • Experience with DevOps tools and automation practices
  • Experience working in multiple SDLC models, including Agile, Waterfall, Iterative, or Spiral
  • Experience with open table formats, such as Apache Iceberg
  • Experience with infrastructure-as-code, including Terraform or CloudFormation
  • Knowledge of data lakes and lakehouse architectures and database performance concepts, such as partitioning
  • Knowledge of Zero Trust and attribute-based access control (ABAC)

Responsibilities

  • Design, build, and maintain scalable data pipelines and platforms.
  • Develop secure, efficient Python for data processing, automation, and warehousing.
  • Migrate and integrate data from on-premises systems to cloud environments.
  • Implement CI/CD pipelines to automate build, test, and deployment workflows.
  • Develop SQL scripts and stored procedures for data processing and transformation.
  • Apply data modeling best practices.
  • Support data governance, metadata tagging, and access controls.
  • Partner with cross-functional teams to deliver end-to-end data solutions.
  • Provide technical guidance in a complex enterprise environment.
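The SQL development and data modeling responsibilities above can be sketched in a few lines. This self-contained example uses SQLite as a stand-in for an enterprise warehouse; the table and column names are illustrative assumptions.

```python
import sqlite3

# In-memory database standing in for a warehouse environment.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (id INTEGER, amount REAL, region TEXT);
    INSERT INTO raw_events VALUES (1, 10.0, 'east'), (2, 5.5, 'west'),
                                  (3, 4.5, 'east');
    -- Transformation step: aggregate raw events into a modeled summary table.
    CREATE TABLE region_totals AS
        SELECT region, SUM(amount) AS total
        FROM raw_events
        GROUP BY region;
""")
rows = dict(conn.execute("SELECT region, total FROM region_totals"))
```

In a production pipeline the same transformation would typically live in a stored procedure or a dbt model and be promoted through CI/CD rather than run ad hoc.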

Benefits

  • Health benefits
  • Life benefits
  • Disability benefits
  • Financial benefits
  • Retirement benefits
  • Paid leave
  • Professional development
  • Tuition assistance
  • Work-life programs
  • Dependent care
  • Recognition awards program


What This Job Offers

Job Type

Full-time

Career Level

Senior

Education Level

High school or GED

Number of Employees

5,001-10,000 employees
