Join GDIT’s Intelligence and Homeland Security (IHS) CTO organization and help drive the technical solutions needed to win our most complex and strategic deals. This role as a highly technical AI/ML Delivery Engineer combines deep expertise in artificial intelligence, machine learning, data science, and modern software delivery with the strategic acumen of a solutions architect. The candidate will design, build, and optimize AI/ML solutions across cloud, hybrid, and edge environments. The ideal candidate bridges the gap between data science, software engineering, and systems architecture to enable the delivery of robust, scalable, and fit-for-mission AI solutions.

How an AI/ML Delivery Engineer will make an impact:

Solution Design and Architecture:
- Architect end-to-end AI/ML systems, from data ingestion pipelines and feature stores to model training, evaluation, and deployment in production environments.
- Design distributed and scalable ML workflows leveraging cloud-native technologies (e.g., Kubernetes, Kubeflow, MLflow, SageMaker, Vertex AI, Azure ML, NVIDIA ecosystem).
- Integrate MLOps principles, including CI/CD for ML, model versioning, and automated retraining pipelines.
- Ensure model governance, data lineage, and compliance with US Federal and State requirements.
- Collaborate with data scientists to develop and optimize models for computer vision, NLP, predictive analytics, and generative AI.
- Implement advanced model deployment strategies (e.g., ensemble serving, A/B testing, online learning), including agile AI/ML model deployment operations (ModelOps).
- Design APIs and microservices for AI model consumption across applications and external systems.

Technical Leadership:
- Serve as the AI/ML technical authority, guiding cross-functional engineering, data, and infrastructure teams.
- Conduct architecture reviews, performance benchmarking, and infrastructure optimizations for GPU/TPU clusters.
- Mentor data scientists and software engineers on best practices in AI/ML system design, algorithm selection, and responsible AI.
- Partner with data engineering teams to architect robust ETL/ELT pipelines and data lakehouse architectures.
- Define and implement strategies for real-time and batch data processing, data quality monitoring, and feature store management.
- Optimize AI and data workloads for cost efficiency and performance across compute clusters and cloud resources.