Lead Agentic Data Systems Engineer

Salesforce
San Francisco, CA (Remote)

About The Position

Salesforce is seeking a Lead Agentic Data Systems Engineer to join the Enterprise Data & AI Solutions group. This role is for a hands-on, depth-first engineer who will take architectural blueprints and turn them into production-grade data products, owning each product end-to-end: building, maintaining, and enhancing it. The role redefines the data team model: the engineer manages "hand-off" protocols between specialized AI agents, acting as the central anchor for a hybrid human-agent intelligence unit.

Requirements

  • Production-grade proficiency in Python, dbt, Airflow, and advanced SQL.
  • Experience with Apache Spark and Snowflake.
  • Fluency in AI-native development environments (e.g., Cursor, Codex, or Claude Code).
  • Expertise in prompt engineering.
  • Mastery of agentic frameworks such as LangGraph.
  • Expert-level knowledge of chain-of-thought prompting, self-correction loops, and iterative reasoning paths.
  • Understanding of Salesforce Core and Data 360.
  • Advanced understanding of Data Mesh, Data-as-a-Product (DaaP), and Event-Driven Architectures.
  • Understanding of semantic layers and knowledge graphs.
  • Experience deploying agentic workloads via Docker, Kubernetes, and serverless compute environments.
  • 5+ years of experience in high-stakes Data Engineering, Architecture, or Data Science.
  • A documented history of using generative AI to accelerate personal and departmental output by orders of magnitude.
  • The ability to function as a "Domain Data Officer," managing end-to-end data strategy for a business unit with minimal supervision.
  • Superior analytical judgment—the ability to identify subtle logic errors or hallucinations in agentic output before they reach production.

Responsibilities

  • Architect and maintain a private ecosystem of 10+ autonomous agents specialized in ETL, synthetic data generation, automated QA, and predictive modeling.
  • Design multi-step reasoning architectures and verification protocols to ensure agents autonomously validate and peer-review their own outputs.
  • Transform high-level, ambiguous business requirements into production-ready data products independently.
  • Apply domain knowledge to ensure deployed tools are well governed, implementing governance-as-code for data pipelines and agentic development.
  • Develop and maintain Model Context Protocol (MCP) servers to provide agents with secure, deep-link access to Snowflake, Salesforce, AWS, and proprietary internal data catalogs.

Benefits

  • Time off programs
  • Medical, dental, and vision coverage
  • Mental health support
  • Paid parental leave
  • Life and disability insurance
  • 401(k)
  • Employee stock purchasing program