Mid-Level AI Data Engineer

Global C2 Integration Technologies

About The Position

Global C2 Integration Technologies is looking for a talented and enthusiastic Mid-Level AI Data Engineer to strengthen our growing AI, Data, and Automation portfolio supporting federal and DoD missions. This role serves as a technical cornerstone within project teams, applying strong data engineering fundamentals to enable generative AI, large language model (LLM), and agentic AI solutions. The ideal candidate is hands-on, mission-focused, and comfortable operating in cloud-native environments.

Requirements

  • U.S. Citizenship with eligibility to obtain and maintain a DoD Secret clearance.
  • Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, Engineering, or a related discipline.
  • 3-7 years of hands-on experience delivering data engineering, AI, or ML solutions.
  • Strong understanding of ETL/ELT processes and enterprise data platforms, including experience with technologies such as:
      • OpenSearch / Elasticsearch
      • Kafka
      • AWS Bedrock, Glue, S3, RDS, EBS, Glacier
      • Databricks, Snowflake
  • Experience with vector databases, embeddings, and associated data structures, file formats, APIs, and services (e.g., FAISS, PGVector, OpenSearch/Elasticsearch, Pinecone, Hugging Face, Bedrock Knowledge Bases); a brief sketch of this kind of work follows this list.
  • Practical experience working with generative AI, LLMs, tool calling (e.g., via the Model Context Protocol, MCP), and agent-based architectures (e.g., Agent2Agent, A2A).
  • Familiarity with leading AI platforms and APIs (OpenAI, Anthropic, Gemini, AWS Bedrock, Google Vertex AI).
  • Experience using CI/CD pipelines (GitHub, GitLab, Jenkins).
  • Proficiency with modern development environments (VS Code) and AI-assisted coding tools (e.g., Cline, Claude Code).
  • Strong written and verbal communication skills with a customer-focused mindset.
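
As an illustration of the vector database and embeddings skills listed above, the sketch below indexes and queries document embeddings with FAISS. The embed() helper, sample documents, and embedding dimension are hypothetical placeholders; a real pipeline would generate embeddings with an actual model (e.g., via Hugging Face or Bedrock).

```python
# Minimal sketch: index placeholder document embeddings in FAISS and run a similarity search.
# embed() is a hypothetical stand-in; real embeddings would come from an embedding model.
import faiss
import numpy as np

DIM = 384  # assumed embedding dimension


def embed(texts: list[str]) -> np.ndarray:
    """Placeholder: returns deterministic pseudo-embeddings instead of real model output."""
    rng = np.random.default_rng(len("".join(texts)))
    return rng.random((len(texts), DIM), dtype=np.float32)


documents = ["mission report A", "sensor log B", "maintenance manual C"]

vectors = embed(documents)
faiss.normalize_L2(vectors)        # normalize so inner product behaves like cosine similarity
index = faiss.IndexFlatIP(DIM)     # exact inner-product index
index.add(vectors)

query = embed(["find the maintenance manual"])
faiss.normalize_L2(query)
scores, ids = index.search(query, 2)
for score, doc_id in zip(scores[0], ids[0]):
    print(f"{documents[doc_id]} (score={score:.3f})")
```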

Nice To Haves

  • Demonstrated curiosity and willingness to experiment with emerging AI capabilities in pursuit of real mission impact.
  • Strong Python proficiency and experience with libraries/frameworks such as:
      • PySpark, Pandas, uv, Pydantic, FastAPI
      • LangChain, LangGraph, CrewAI, Unstructured
  • Experience with Infrastructure as Code tools (Terraform, OpenTofu, AWS CDK, CloudFormation).
  • Experience with agentic AI frameworks and platforms (Agent2Agent Protocol, AWS Bedrock Agents, Mastra, CrewAI, Strands, AgentCore).
  • Exposure to adjacent disciplines including data science, cloud engineering, platform engineering, or UI/UX development.
  • Familiarity with federal cybersecurity requirements and frameworks (RMF, FedRAMP, NIST).

Responsibilities

  • Design, build, and optimize data pipelines to ingest, clean, normalize, and structure data for generative AI, LLMs, and Retrieval-Augmented Generation (RAG) use cases.
  • Enable reliable, secure access to structured and unstructured data sources supporting AI workflows.
  • Manage and organize large-scale datasets across cloud platforms (AWS, Azure, GCP).
  • Implement and maintain medallion architectures (Bronze/Silver/Gold) to ensure data quality, lineage, governance, and accessibility (see the sketch after this list).
  • Work with SQL and NoSQL databases to model, query, and load large datasets at scale.
  • Monitor, tune, and maintain high-performance data stores supporting analytics, AI workloads, and reporting.
  • Collaborate closely with data engineers, software engineers, AI engineers, and data scientists to develop operational, agentic AI systems.
  • Participate in technical design reviews and cross-functional solutioning.
  • Support automated deployment pipelines using Infrastructure as Code (IaC), CI/CD frameworks, and containerized environments.
  • Contribute to repeatable, secure, cloud-native deployment patterns.
  • Monitor AI-enabled systems post-deployment.
  • Perform performance tuning and apply best practices for scalability, reliability, and availability.
  • Develop clean, maintainable, and well-documented code aligned with industry best practices and federal standards.
  • Support reproducible development and operational transparency.
  • Stay current on emerging AI, data engineering, and automation technologies.
  • Actively learn from senior engineers and AI architects through mentorship and technical collaboration.
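
To make the medallion-architecture responsibility above concrete, the sketch below shows a minimal Bronze/Silver/Gold flow in PySpark (one of the libraries named in this posting). The sample rows, column names, and aggregation are hypothetical; a production pipeline would add schema enforcement, lineage capture, and governance controls.

```python
# Minimal sketch of a Bronze -> Silver -> Gold (medallion) flow in PySpark.
# The sample data, column names, and aggregation are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw ingest, landed as-is (messy strings, nulls included).
bronze = spark.createDataFrame(
    [("2024-01-01", "sensor-1", " 42.0 "),
     ("2024-01-01", "sensor-2", "17.5"),
     ("2024-01-01", "sensor-1", None)],
    ["event_date", "source", "reading"],
)

# Silver: cleaned and normalized (trim strings, cast types, drop unusable rows).
silver = (
    bronze
    .withColumn("reading", F.trim("reading").cast("double"))
    .dropna(subset=["reading"])
)

# Gold: aggregated, analytics- and AI-ready view.
gold = (
    silver
    .groupBy("event_date", "source")
    .agg(F.avg("reading").alias("avg_reading"))
)
gold.show()
```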