Agentic AI Data Engineer

Dynatron Software · Richardson, TX
Posted 17d · $140,000 - $180,000 · Remote

About The Position

Dynatron is transforming the automotive service industry with intelligent SaaS solutions that deliver measurable results for thousands of dealership service departments. Our proprietary analytics, automation capabilities, and AI-powered workflows empower service leaders to increase profitability, elevate customer satisfaction, and operate with greater efficiency. With accelerating demand and a rapidly expanding product ecosystem, we’re scaling fast, and we’re just getting started.

The Opportunity

We’re looking for a hands-on and forward-thinking AI Data Engineer to build and operationalize agentic AI systems that combine modern data engineering, LLM orchestration, and automation frameworks. In this high-impact role, you will design the pipelines, retrieval systems, and infrastructure that allow AI agents to reason, take action, and deliver real-time intelligence using enterprise data. As a core member of our AI & Data Engineering team, you’ll work with cutting-edge platforms, including AWS Bedrock, LangChain, Snowflake, and vector databases, to develop AI-driven workflows that enhance automation across the business. This is a rare opportunity to help shape the foundation of Dynatron’s AI strategy while contributing directly to product innovation and operational excellence.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 5+ years of data engineering experience, including 2+ years building AI-integrated data systems.
  • Hands-on expertise with:
      • AWS Bedrock, SageMaker, and Lambda
      • LangChain or LlamaIndex
      • Snowflake, Redshift, or Databricks
      • Python, SQL, and API integrations
      • Vector databases (Pinecone, FAISS, Chroma)
  • Familiarity with RAG pipelines, LLM function calling, and prompt optimization techniques.
  • Experience integrating enterprise data with LLMs from Anthropic, OpenAI, or Meta.
  • Strong understanding of data modeling, ETL orchestration, and MLOps best practices.

Nice To Haves

  • Experience with multi-agent frameworks such as CrewAI, AutoGen, or Bedrock Agents.
  • Knowledge of data observability tools like Monte Carlo, DataHub, or Marquez.
  • Familiarity with Docker, Kubernetes, and CI/CD automation.
  • Relevant industry or academic certifications, including:
      • AWS Certified Machine Learning – Specialty
      • Google Cloud Generative AI Engineer
      • MIT/Stanford AI & ML Certifications
      • DeepLearning.AI LLM Applications Certificate

Responsibilities

  • Design and maintain scalable data pipelines and ingestion frameworks powering advanced AI and agent workflows using AWS Glue, Lambda, Step Functions, and Kinesis.
  • Prepare and optimize structured and unstructured data for retrieval-augmented generation (RAG) and LLM-enabled use cases.
  • Integrate vector databases such as Pinecone, Chroma, Amazon OpenSearch, or FAISS for semantic retrieval.
  • Automate context curation, data transformations, and memory persistence to support dynamic prompt construction and autonomous agent behavior.
  • Build and deploy autonomous AI agents using LangChain, LlamaIndex, or AWS Agents for Bedrock.
  • Develop multi-agent workflows capable of reasoning, tool usage, and decision-making through secure function calling.
  • Integrate AI agents with enterprise systems including Snowflake, Redshift, Databricks, Salesforce, JIRA, and ServiceNow.
  • Create pipelines that enable agents to execute SQL queries, call APIs, and perform document searches with full governance and security.
  • Operationalize Bedrock-hosted models (Anthropic Claude, Amazon Titan, Llama 3, and others) into data engineering workflows.
  • Build embedding pipelines and feature stores supporting intelligent retrieval and semantic search.
  • Implement CI/CD for AI services using GitHub Actions, Airflow, or AWS CodePipeline.
  • Monitor model accuracy, hallucination rates, and system reliability through automated evaluation.
  • Apply data security, masking, and lineage controls to all AI-enabled pipelines and retrieval systems.
  • Ensure Responsible AI practices such as transparency, fairness, and auditability are embedded throughout the AI stack.
  • Maintain metadata, logging, and observability for all AI and data engineering workflows.

Benefits

  • Competitive base salary
  • Participation in Dynatron’s Equity Incentive Plan
  • Comprehensive health, vision, and dental insurance
  • Employer-paid short- and long-term disability and life insurance
  • 401(k) with competitive company match
  • Flexible vacation policy and 9 paid holidays
  • Remote-first culture