Sr. Technical Product Manager, AI Data Platforms

DataDirect Networks (DDN) · San Francisco, CA (Remote)

About The Position

About the Opportunity

The AI infrastructure market is being redefined. Enterprises across Financial Services, Health & Life Sciences, Retail, Manufacturing, and Sovereign AI programs are racing to connect their data to large language models, AI agents, and multimodal AI applications, and storage is at the center of that transformation.

DDN is at the forefront of this shift. As a strategic partner in the NVIDIA ecosystem, DDN delivers Enterprise AI HyperPOD, a purpose-built, GPU-accelerated solution that makes enterprise generative AI real at scale.

We are looking for a Sr. Technical Product Manager, AI Data Platforms who will own DDN's AI Data Platform (AIDP) solution strategy end to end: shaping the product roadmap, defining differentiated solutions built on Enterprise AI HyperPOD, and driving go-to-market execution in partnership with NVIDIA, the broader NVIDIA ecosystem, and a growing network of technology and channel partners.

This is a high-impact, high-visibility role for someone equally at home translating complex AI infrastructure architectures into compelling market solutions and working with engineering to define the features that make DDN's AI Data Platform offering best in class.

Requirements

  • 10+ years of product management experience, with at least 2 years in enterprise infrastructure, storage, or AI/ML infrastructure.
  • Solid understanding of AI/ML workflows — particularly Retrieval-Augmented Generation (RAG), LLM inference pipelines, and vector search.
  • Experience working within hardware/software ecosystem partner programs (NVIDIA, Intel, AMD, or similar) with demonstrated ability to manage multi-party product and GTM initiatives.
  • Demonstrated ability to drive cross-functional execution across engineering, sales, and marketing without direct authority.
  • Excellent written and verbal communication — you can write a PRD and present to a C-suite customer in the same day.

Nice To Haves

  • Familiarity with the NVIDIA AI Enterprise software stack: NIM microservices, NeMo Retriever, cuVS, or related components.
  • Experience with high-performance or parallel file systems, object storage, or NVMe-based storage architectures.
  • Experience with one or more of DDN's core verticals: Financial Services, Health & Life Sciences, Retail, Manufacturing, or Sovereign AI / government programs.
  • Familiarity with enterprise data governance, access control frameworks, and regulatory compliance requirements (HIPAA, FedRAMP, DORA, or similar).
  • Experience navigating and activating technology partner ecosystems — co-sell motions, joint solution development, and partner-led pipeline generation.
  • Experience with Kubernetes-based, cloud-native infrastructure deployments.

Responsibilities

Product Strategy & Roadmap

  • Define and own the product roadmap for DDN Enterprise AI HyperPOD as it relates to the NVIDIA AI Data Platform reference architecture, aligning DDN's capabilities with AIDP requirements across hardware (NVIDIA Blackwell GPUs, BlueField DPUs, Spectrum-X networking) and software (NVIDIA AI Enterprise).
  • Identify and prioritize DDN-specific differentiation opportunities within the AIDP framework — including metadata monitoring and filtering, access controls, data governance and rollback features, high-performance storage protocols, enterprise data connectors, and vector database integrations.
  • Translate NVIDIA AIDP reference architecture specifications into concrete DDN product requirements across all supported storage topologies.
Ecosystem & Partner Engagement

  • Serve as the primary product interface with NVIDIA for AIDP partner activities, aligning DDN's roadmap and certification milestones with NVIDIA's program requirements and design guide specifications.
  • Build and manage product relationships across the extended NVIDIA AI ecosystem — including ISVs, system integrators, cloud service providers, networking vendors, and complementary hardware partners — to develop joint solutions and validated reference architectures on Enterprise AI HyperPOD.
  • Engage with vector database partners, LLM and inference platform vendors, and enterprise data connector ecosystem participants to ensure DDN integrations are current, validated, and differentiated.
  • Represent DDN in NVIDIA partner programs, joint engineering reviews, and ecosystem advisory councils, ensuring DDN's voice and requirements are reflected in NVIDIA's evolving AIDP specifications.
  • Identify and develop co-sell and co-marketing opportunities with NVIDIA and ecosystem partners to accelerate DDN's pipeline in key verticals.
Solution Design & Enablement

  • Define validated, customer-ready AIDP solution configurations on Enterprise AI HyperPOD, including GPU sizing guidance, network topology recommendations, and SLA-driven performance targets for continuous data ingestion and retrieval workloads.
  • Build solution briefs, reference architectures, and technical positioning materials covering the full AIDP pipeline: multimodal document ingestion (extraction, embedding, indexing) and retrieval (query embedding, reranking, RAG).
  • Ensure DDN solutions address the full enterprise data lifecycle — continuous ingestion, data drift prevention, semantic search accuracy, and data governance — meeting the rigorous compliance and security requirements of regulated industries.
Go-to-Market & Vertical Execution

  • Collaborate with Sales, Solutions Engineering, and Marketing to build compelling narratives and win enterprise deals across DDN's target verticals.
  • Develop vertical-specific solution positioning for Financial Services, Health & Life Sciences, Retail, Manufacturing, and Sovereign AI programs, translating AIDP capabilities into outcomes that resonate in each market.
  • Represent DDN's AIDP solutions externally at industry events, with strategic customers, and with analyst and press audiences.
Voice of Customer & Market Intelligence

  • Engage directly with enterprise customers and prospects to understand AI data infrastructure requirements, buying criteria, and deployment challenges specific to their industry.
  • Track the evolving GenAI infrastructure landscape, including competing storage platforms, vector database ecosystems, LLM inference patterns, agentic AI deployment models, and sovereign AI regulatory requirements.
  • Feed market and ecosystem insights back into the Enterprise AI HyperPOD roadmap and partner strategy.