Technical Program Manager, Data Lake

Ansira · Northampton, MA

About The Position

Ansira is consolidating and scaling a unified enterprise Data Lake to integrate product, media, and business data, standardize reporting, and accelerate decision-making across the organization. We are seeking a Technical Program Manager (TPM) to lead this cross-functional program end to end: aligning product and engineering roadmaps, driving ingestion and migration from legacy systems, maturing data governance and quality, and ensuring business adoption of standardized, self-serve analytics. This leader will orchestrate work across Ansira’s Product Solutions, Data Engineering, Data Science/BI, Media, and Client Partnership teams, with a clear mandate: deliver consistent, governed, and performant data to downstream products and reporting while deprecating redundant systems and minimizing operational cost.

Requirements

  • 8+ years in Program/Project/Product Management, with 5+ years leading complex data platform initiatives in a cloud environment.
  • Proven delivery of cross-functional data programs involving multiple product lines and business stakeholders; strong executive communication.
  • Hands-on experience with modern data stacks: one or more of Snowflake/BigQuery/Databricks; Azure Data Factory/Airflow; dbt; Kafka/Kinesis; Git/Terraform; REST/SFTP integrations.
  • Strong grounding in data governance and quality practices, data contracts, catalog/lineage, and secure data access.
  • Demonstrated expertise in Agile at scale (Scrum/Kanban), Jira/Confluence, dependency/risk management, and budget tracking (including CAPEX/OPEX).
  • Competent SQL skills for validation/triage; fluency in reading pipeline/log artifacts and interpreting BI/semantic model requirements (a minimal example follows this list).
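
To make the SQL expectation concrete: a typical validation/triage query reconciles a legacy source against its lake target after a load. The sketch below is illustrative only; the table and column names (legacy_db.orders, lake.raw_orders, loaded_at) are placeholders, not Ansira's actual schema.

    -- Reconcile row counts and freshness between a legacy table and its
    -- Data Lake landing table after a migration load. All object names
    -- here are hypothetical placeholders.
    SELECT
        (SELECT COUNT(*) FROM legacy_db.orders)        AS legacy_rows,
        (SELECT COUNT(*) FROM lake.raw_orders)         AS lake_rows,
        (SELECT COUNT(*) FROM legacy_db.orders)
          - (SELECT COUNT(*) FROM lake.raw_orders)     AS row_gap,
        (SELECT MAX(loaded_at) FROM lake.raw_orders)   AS last_load_ts;

A nonzero row_gap or a stale last_load_ts is usually the first signal to pull the pipeline logs for the affected run.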

Nice To Haves

  • Background in marketing/media data and standardized performance reporting (e.g., AdTech/LBN media data, campaign hierarchies, Power BI embedded).
  • Prior experience migrating from legacy ETL/BI ecosystems (e.g., Alteryx/Insighter/Tableau) to a lakehouse with standardized semantic layers.
  • Experience establishing data domains and productizing data (SLAs, contracts, versioning, lifecycle) to accelerate downstream analytics.
  • Familiarity with privacy, security, and compliance standards (e.g., RBAC/ABAC, PII governance) and enterprise SSO/permissions models for embedded analytics.
  • FinOps mindset: cost observability, unit economics, and right-sizing compute/storage (see the sketch after this list).
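
On the FinOps point above: cost observability often starts with a recurring query over the warehouse's own metering data. The sketch below uses Snowflake's ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view (a real Snowflake view; other platforms expose analogous billing data), with a placeholder per-credit rate for rough unit economics.

    -- Weekly compute credit burn per warehouse over the last quarter.
    -- The 3.00 USD/credit rate is a placeholder; actual rates are contractual.
    SELECT
        DATE_TRUNC('week', start_time)  AS usage_week,
        warehouse_name,
        SUM(credits_used)               AS credits,
        SUM(credits_used) * 3.00        AS est_cost_usd
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('month', -3, CURRENT_TIMESTAMP())
    GROUP BY 1, 2
    ORDER BY usage_week, est_cost_usd DESC;

Trending this by week makes right-sizing decisions (smaller warehouses, tighter auto-suspend, rescheduled jobs) measurable rather than anecdotal.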

Responsibilities

  • Own the multi-quarter program plan for the unified Data Lake: scope, roadmap, milestones, budgets (OPEX/CAPEX), risks, and dependencies.
  • Stand up and run the operating model: weekly workstream standups, cross-functional syncs, monthly steering committee, and a transparent executive status rhythm.
  • Build and maintain a single source of truth for delivery: program charter, RACI, RAID log, decision log, intake/triage process, and dashboards for progress/risks.
  • Drive the migration plan from legacy pipelines and tools (e.g., Alteryx, Insighter) to the target stack (e.g., Snowflake, Power BI embedded via platform connectors).
  • Coordinate parallel workstreams (ingestion, modeling, governance, reporting cutover) to hit time-bound deliverables with predictable quality.
  • Define and maintain the Data Lake program backlog, translating business use cases into technical epics, data contracts, and acceptance criteria.
  • Partner with Product and Data Science teams to standardize media and product reporting packages and ensure they’re backed by governed, contract-driven data.
  • Prioritize sources and domains for ingestion based on business value, client impact, and technical feasibility; establish clear go/no-go gates.
  • Align with platform architecture to ensure scalable patterns for batch/stream ELT/CDC, cost control, observability, and reusability across domains.
  • Establish practical data contracts with upstream product and business owners; define schema, SLAs, lineage, and DQ checks at ingestion (a conformance-check sketch follows this list).
  • Stand up governance ceremonies and roles (data owners, stewards) and implement data catalog/lineage practices to improve discoverability and trust.
  • Define and monitor quality KPIs (completeness, timeliness, accuracy) and drive remediation plans with accountable teams (a monitoring sketch follows this list).
  • Ensure data privacy, compliance, and security best practices (e.g., PII handling, role-based access, data masking) across environments.
  • Serve as the connective tissue across Product, Engineering, Data Science, Media, Finance, and Client Partnership — communicating decisions, trade-offs, and timelines.
  • Lead change management for reporting standardization (e.g., standard AdTech/LBN-based media reports), business onboarding to the lake, and client-facing cutovers.
  • Create enablement assets (runbooks, playbooks, onboarding guides) and training plans to accelerate adoption and reduce support burden.
  • Partner effectively with architects and data engineers on Snowflake/BigQuery/Databricks, Azure/AWS/GCP services, orchestration (ADF/Airflow), and transformation (dbt).
  • Understand ELT/CDC patterns (a merge-based sketch follows this list), API/file ingestion, schema design for analytics, and BI tooling (Power BI, Looker).
  • Write and review basic SQL for validation.
  • Apply FinOps and performance/cost optimization practices (storage tiers, compute sizing, job scheduling, caching strategies).
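
To ground the data-contract bullet above: conformance can be checked mechanically at ingestion by comparing the landed schema against the contracted one. The sketch below is a minimal, hypothetical check; the schema, table, and contracted column/type values are placeholders, and in practice such checks usually live in tooling such as dbt tests rather than ad hoc SQL.

    -- Flag landed columns whose name/type pair deviates from the agreed
    -- contract. Catches type drift and unexpected columns; missing columns
    -- need a separate check. All names and types below are placeholders.
    SELECT column_name, data_type
    FROM information_schema.columns
    WHERE table_schema = 'RAW'
      AND table_name   = 'MEDIA_SPEND'
      AND NOT (
           (column_name = 'CAMPAIGN_ID' AND data_type = 'TEXT')
        OR (column_name = 'SPEND_USD'   AND data_type = 'NUMBER')
        OR (column_name = 'LOADED_AT'   AND data_type = 'TIMESTAMP_NTZ')
      );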
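
Likewise, the quality KPIs named above reduce to simple SQL that can feed a dashboard or alert. The object names (lake.media_spend, campaign_id, loaded_at) and the 24-hour freshness SLA below are illustrative assumptions, not a known Ansira standard.

    -- Daily completeness and timeliness check for one ingested table.
    -- Thresholds would come from the domain's data contract/SLA.
    SELECT
        COUNT(*)                                        AS total_rows,
        COUNT(campaign_id)                              AS rows_with_key,
        ROUND(100.0 * COUNT(campaign_id) / NULLIF(COUNT(*), 0), 2)
                                                        AS completeness_pct,
        MAX(loaded_at)                                  AS last_load_ts,
        CASE WHEN MAX(loaded_at) < CURRENT_TIMESTAMP - INTERVAL '24 hours'
             THEN 'LATE' ELSE 'ON_TIME' END             AS timeliness_status
    FROM lake.media_spend;

Accuracy checks typically follow the same pattern but compare against a trusted reference (source-system totals, finance actuals) rather than the table alone.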
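
Finally, the CDC pattern referenced above often lands as a merge from a change-capture staging table into the lake table. The sketch below is a generic illustration; the tables (staging.customers_cdc, lake.customers) and the op change-flag column are hypothetical.

    -- Apply CDC events (op = 'I'/'U'/'D') from staging to the target table.
    -- Table and column names are placeholders for illustration.
    MERGE INTO lake.customers AS t
    USING staging.customers_cdc AS s
        ON t.customer_id = s.customer_id
    WHEN MATCHED AND s.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE
        SET t.email = s.email, t.updated_at = s.updated_at
    WHEN NOT MATCHED AND s.op <> 'D' THEN
        INSERT (customer_id, email, updated_at)
        VALUES (s.customer_id, s.email, s.updated_at);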