AWS Data & AI Engineer, Senior

Booz Allen Hamilton, Washington, DC, USA

About The Position

The Opportunity

AI solutions only succeed when they are built on reliable data platforms and production-ready ML workflows. As an AWS Data and AI Engineer, you’ll enable intelligent, scalable systems by engineering the data pipelines and machine learning foundations that move models from experimentation to mission-ready deployment.

In this role, you’ll focus on designing, building, and operationalizing machine learning solutions on AWS, with a strong emphasis on Amazon SageMaker. You’ll work alongside solution architects, data scientists, and application teams to deliver secure, scalable ML pipelines—supporting everything from data ingestion and feature engineering to model training, deployment, and monitoring in compliance-driven federal environments.

This role is ideal for an engineer who enjoys hands-on development, building ML platforms, and growing into a senior AI or ML engineering role within AWS-centric ecosystems. Work with us and help build the future of AI-enabled systems in the Federal Government.

Join us. The world can’t wait.

Requirements

  • 4+ years of experience as a data engineer, ML engineer, or software engineer working with data-driven or ML-enabled systems
  • Experience designing and operating end-to-end ML workflows using Amazon SageMaker, including SageMaker Studio or Notebooks, training jobs and hyperparameter tuning, managed model endpoints and batch inference, SageMaker Pipelines, Model Registry, and experiment tracking
  • Experience building data pipelines and feature engineering workflows using AWS services, such as S3, Glue, Redshift, EMR, Athena, or Lambda
  • Experience with Python development for data processing and ML workloads, and with SQL
  • Experience deploying and managing containerized ML workloads using Docker, ECR, and AWS-managed compute
  • Knowledge of ML frameworks and libraries commonly used with SageMaker, such as PyTorch, TensorFlow, scikit-learn, or XGBoost
  • Knowledge of MLOps concepts, including CI/CD for ML, model versioning, monitoring, and retraining
  • Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
  • Bachelor’s degree in Computer Science, Engineering, or Data Science
  • Ability to obtain an AWS Certification, such as AWS Machine Learning – Specialty or AWS Solutions Architect – Associate, within 3 months of start date

Nice To Haves

  • Experience implementing production MLOps pipelines using SageMaker Pipelines, Step Functions, or CI/CD tools
  • Experience supporting FedRAMP or ATO-driven cloud environments
  • Experience operationalizing models developed by data scientists or research teams
  • Experience working with OpenAI models or APIs, including integrating large language models into applications, building prompt-based workflows, or supporting GenAI use cases
  • Experience working in Agile or DevSecOps teams
  • Knowledge of GenAI or foundation model workflows using SageMaker, such as JumpStart, managed foundation models, or custom LLM deployments
  • Knowledge of IAM, VPC networking, encryption, and security controls for ML workloads in regulated environments

Benefits

  • Health, life, and disability insurance
  • Financial and retirement benefits
  • Paid leave
  • Professional development
  • Tuition assistance
  • Work-life programs
  • Dependent care
  • Recognition awards program