Artificial Intelligence Engineer

Stefanini Group · Dearborn, MI
Onsite

About The Position

Stefanini Group is hiring a Senior AI & Data Engineer to bridge the gap between scalable data architecture and Generative AI. This role is designed for an engineer who views data as a product and has a proven track record of building production-grade AI-powered applications and agentic workflows. Unlike traditional data modeling roles, you will architect the underlying systems that power the data ecosystem, integrating LLMs, Retrieval-Augmented Generation (RAG) solutions, and autonomous agents into enterprise-scale platforms on Google Cloud Platform (GCP).

Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. With a presence in the Americas, Europe, Africa, and Asia, Stefanini serves over four hundred clients across various markets and is a CMMI Level 5 IT consulting company.

Requirements

  • 7+ years of experience in a relevant field (Senior Specialist level).
  • 6–8 years in Software or Data Engineering, with a strong emphasis on Python-based application development.
  • 2–4 years of hands-on experience building with LLMs, GPT-based APIs, or Vector Databases.
  • Experience fine-tuning LLMs.
  • Extensive experience with GCP (preferred), including BigQuery, Vertex AI, Cloud Run, and Postgres.
  • Proven ability to design OLTP and OLAP patterns and document complex information flows.
  • Expert proficiency in Python, SQL, and Terraform.
  • Experience with MCP (Model Context Protocol), RAG, LangChain, Google Agent Development Kit (ADK).
  • REST API development using Flask/Flask API.
  • Bachelor's Degree.

Responsibilities

  • Architect and deploy agentic AI workflows and RAG solutions.
  • Move beyond simple prompting to build systems capable of multi-step reasoning and tool use.
  • Design and implement high-performance analytical data products using streaming (Pub/Sub, Dataflow) and batch patterns, ensuring data is "AI-ready."
  • Own and scale cloud infrastructure using Terraform.
  • Prioritize security, cost-optimization, and automated environment provisioning.
  • Build robust pipelines that monitor not just data quality, but AI model performance, latency, and drift in production.
  • Operate within an Agile environment using Test-Driven Development (TDD).
  • Maintain rigorous code quality standards via SonarQube, Checkmarx, and Cycode integrated with CI/CD pipelines and GitHub Actions.
  • Implement enterprise-grade data protection and sharing models.
  • Map complex data lineages to ensure compliance and traceability.
  • Provide Tier 3 support for production AI services, ensuring uptime and performance according to established SLAs.