Data Architect - Market (Manager)

Huron Consulting Services
Chicago, IL (Remote)

About The Position

Huron helps its clients drive growth, enhance performance and sustain leadership in the markets they serve. We help healthcare organizations build innovation capabilities and accelerate key growth initiatives, enabling organizations to own the future instead of being disrupted by it. Together, we empower clients to create sustainable growth, optimize internal processes and deliver better consumer outcomes.

Health systems, hospitals and medical clinics are under immense pressure to improve clinical outcomes and reduce the cost of providing patient care. Investing in new partnerships, clinical services and technology is not enough to create meaningful and substantive change. To succeed long-term, healthcare organizations must empower leaders, clinicians, employees, affiliates and communities to build cultures that foster innovation and achieve the best outcomes for patients.

Joining the Huron team means you’ll help our clients evolve and adapt to the rapidly changing healthcare environment, optimize existing business operations, improve clinical outcomes, create a more consumer-centric healthcare experience, and drive physician, patient and employee engagement across the enterprise. Join our team as the expert you are now and create your future.

This role sits within a strategic investment to embed AI into how we operate, serve customers, and make decisions within our healthcare business. We’re building a healthcare-wide AI data and context platform, with deep domain expertise embedded throughout our architecture.
Our goals are:

  • Turn structured and unstructured information into trusted, reusable “building blocks” (semantic layers, retrieval services, and agent-ready interfaces) that accelerate product innovation
  • Deliver transformational speed and leverage: faster time-to-insight, higher automation of knowledge work, and a foundation that scales AI safely and reliably as adoption grows
  • Unlock new capabilities across our business and create the foundation that drives deeper domain innovation and cross-domain collaboration

This is a hands-on technical architect role that owns the design and delivery of core AI/context data capabilities. The role is responsible for end-to-end architecture decisions across the platform (unstructured ingestion, embeddings, retrieval, semantic layers, and governance) while partnering across engineering, product, and AI teams to ship production-grade AI data products. Leadership is exercised through technical ownership, design authority, and cross-functional influence.

This role is expected to grow into direct people leadership over time. As the platform matures and the engineering team expands, the Architect will take on formal responsibility for leading a small team of engineers, owning hiring input, technical development, and delivery oversight. Candidates should be comfortable with that trajectory and motivated by the opportunity to build and shape a team from an early stage.

Requirements

  • 8–12+ years in data engineering, data architecture, or platform roles with significant hands-on delivery
  • Expert SQL and strong Python (or Scala/Java); deep production engineering habits
  • Hands-on Snowflake expertise including advanced data modeling, pipeline design, performance tuning, and operating at scale in production
  • Proven experience designing cloud data architectures on AWS, Azure, or GCP, including storage, compute, orchestration, and networking considerations
  • Hands-on experience with vector search and embeddings (pgvector, Pinecone, Weaviate, OpenSearch, Elastic) and retrieval patterns (semantic retrieval, hybrid search, reranking)
  • Experience with dbt or comparable semantic layer tooling in a production environment
  • Demonstrated ability to lead cross-functional technical initiatives and drive alignment across teams
  • Strong written and verbal communication skills — able to present architecture decisions to both technical and non-technical audiences
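For illustration, the retrieval patterns named above (semantic retrieval plus keyword search, fused and reranked) can be sketched in a few lines. Reciprocal rank fusion is one common way to combine rankings from multiple retrievers; the document ids and result lists below are hypothetical, not from any real index:

```python
from collections import defaultdict

def reciprocal_rank_fusion(ranked_lists, k=60):
    """Fuse several ranked result lists (e.g. from a semantic retriever
    and a keyword retriever) into one ranking using reciprocal rank
    fusion. Each input list holds document ids, best match first."""
    scores = defaultdict(float)
    for ranking in ranked_lists:
        for rank, doc_id in enumerate(ranking, start=1):
            # Documents ranked highly by several retrievers accumulate
            # the largest fused score; k dampens the effect of rank 1.
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical results from two retrievers for the same query.
semantic_hits = ["doc_a", "doc_b", "doc_c"]
keyword_hits = ["doc_c", "doc_a", "doc_d"]

fused = reciprocal_rank_fusion([semantic_hits, keyword_hits])
```

Here `doc_a` wins the fused ranking because both retrievers rank it near the top; in production the fused list would typically feed a cross-encoder reranker and metadata filters.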

Nice To Haves

  • Experience supporting LLM applications (RAG, agent tool interfaces, evaluation/observability)
  • Knowledge of knowledge graphs, semantic modeling, or metrics layers at scale
  • Experience in regulated environments and mature data governance programs
  • Familiarity with Iceberg, Delta Lake, or other open table formats in a lakehouse context
  • Prior experience in a formal or informal technical lead or staff engineer capacity
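The evaluation and observability work mentioned above often begins with a simple offline metric such as recall@k over a curated eval set. A minimal sketch, with a hypothetical eval set mapping each query to a retrieved ranking and its known-relevant documents:

```python
def recall_at_k(retrieved, relevant, k):
    """Fraction of the known-relevant doc ids found in the top-k
    retrieved list for one query."""
    hits = len(set(retrieved[:k]) & set(relevant))
    return hits / len(relevant) if relevant else 0.0

# Hypothetical eval set: query id -> (retrieved ranking, relevant ids).
eval_set = {
    "q1": (["d1", "d9", "d3"], ["d1", "d3"]),  # both relevant docs found
    "q2": (["d7", "d2", "d5"], ["d4"]),        # relevant doc missed
}

avg_recall = sum(
    recall_at_k(retrieved, relevant, k=3)
    for retrieved, relevant in eval_set.values()
) / len(eval_set)
```

Tracking a metric like this per release turns retrieval changes (chunking, embeddings, reranking) into regression-testable engineering work rather than guesswork.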

Responsibilities

  • Architect and own the AI context platform
  • Design end-to-end platform architecture: ingestion → parsing/chunking → enrichment → embeddings → vector indexing → retrieval/serving
  • Define scalable patterns for incremental refresh, backfills, re-embeddings, deduplication, and lineage across unstructured sources
  • Set technical direction for retrieval quality (query strategies, hybrid search, metadata filtering, reranking) in partnership with AI engineers
  • Evaluate and select infrastructure, tooling, and cloud services to support platform needs across AWS/Azure/GCP environments
  • Design and deliver semantic and governed data products
  • Architect and implement semantic layers (metrics/entities) that power BI and agent reasoning consistently across the platform
  • Define data contracts and context contracts for AI inputs (schemas, metadata requirements, freshness, citation expectations)
  • Establish standards for discoverability, documentation, and reusability across datasets and indexes
  • Own the dbt or semantic layer tooling strategy and ensure consistent application across workstreams
  • Own reliability and performance at the platform level: monitoring, alerting, SLAs/SLOs, runbooks, incident response, and postmortems
  • Drive cost and latency optimization across Snowflake, lakehouse, and vector infrastructure
  • Set engineering standards for CI/CD, testing, and evaluation (retrieval eval sets, regression tests, online telemetry)
  • Implement security-by-design: RBAC/ABAC patterns, PII redaction, retention controls, audit logging, and safe access pathways for agent tools
  • Partner with Security/Legal/Compliance to define and enforce guardrails for AI access to enterprise knowledge
  • Own governance patterns for sensitive data handling across the platform
  • Drive technical roadmap decomposition with product, AI, and application stakeholders
  • Facilitate architectural decisions across teams and functions, building alignment without direct authority
  • Set best practices and mentor engineers via design reviews, code reviews, and documentation
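The "context contracts" responsibility above can be made concrete with a small validator that checks a chunk against required metadata, a freshness budget, and citation expectations before it is served to an agent. The field names and thresholds below are illustrative assumptions, not an existing schema:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ContextContract:
    """Hypothetical contract for one AI input source: metadata fields
    that must be present, a freshness budget, and whether citations
    must be attached before a chunk reaches an agent."""
    required_metadata: set
    max_staleness: timedelta
    citations_required: bool = True

    def violations(self, chunk: dict) -> list:
        problems = []
        missing = self.required_metadata - chunk.get("metadata", {}).keys()
        if missing:
            problems.append(f"missing metadata: {sorted(missing)}")
        age = datetime.now(timezone.utc) - chunk["ingested_at"]
        if age > self.max_staleness:
            problems.append("stale: exceeds freshness budget")
        if self.citations_required and not chunk.get("citations"):
            problems.append("no citations attached")
        return problems

contract = ContextContract(
    required_metadata={"source_system", "document_id"},
    max_staleness=timedelta(days=7),
)
chunk = {
    "metadata": {"source_system": "ehr_notes"},  # document_id absent
    "ingested_at": datetime.now(timezone.utc) - timedelta(days=10),
    "citations": [],
}
issues = contract.violations(chunk)
```

Enforcing a contract like this at ingestion and at serving time is one way the lineage, freshness, and governance responsibilities above become testable guarantees rather than documentation.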

Benefits

  • Medical, dental and vision coverage
  • Other wellness programs