Data Science Engineer

ALICE + OLIVIA
New York, NY
Hybrid

About The Position

Alice + Olivia is hiring a Decision Science Engineer to build, operationalize, and own predictive models that improve decision-making across Planning, Merchandising/Buying, Customer 360, and Inventory Management. Working primarily within Palantir Foundry as the core analytics and deployment platform, this role bridges data science and platform engineering, translating real-world commercial constraints, including seasonality, product lifecycle, allocations, replenishment, promotions/markdowns, and customer behavior, into practical, production-grade, human-in-the-loop models.

This is a highly cross-functional role with direct, ongoing exposure to business stakeholders and end users, from planners and buyers who consume and act on model outputs, to the teams working within Foundry Workshop applications day-to-day. Understanding how outputs are used in real workflows, and staying close to the people using them, is central to how this role operates.

Sitting within the IT organization and serving the full business, including DTC, wholesale, planning, and supply chain, this position carries direct visibility to executive business stakeholders. This role is focused on applied data science, decision-support modeling, and model operationalization, not reporting or core data engineering, while partnering closely with Engineering, BI, and the broader Enterprise Applications team.

Requirements

  • 4 to 8+ years of experience in data science, applied analytics, or predictive modeling.
  • Strong experience building predictive or statistical models including forecasting, optimization, regression, and classification.
  • Proficiency in Python for data analysis, modeling, and production-grade code.
  • Strong SQL and hands-on experience with Snowflake or equivalent cloud data warehouse.
  • Experience with Palantir Foundry strongly preferred, including familiarity with the Foundry Ontology layer, analytics and modeling tooling (Code Workbook, Code Repository), and how model outputs are surfaced through business-facing applications. Candidates who can reason within and contribute to an existing Ontology, and who understand how the semantic data layer connects to model design, will be prioritized.
  • Familiarity with experiment design and causal inference.
  • Experience with time-series forecasting at scale.
  • Retail domain experience strongly preferred, including familiarity with demand vs net sales, sell-through, markdowns, planning vs buying workflows, seasons, price groups, and store vs e-commerce dynamics.
  • Experience working with messy, real-world data and designing models that support business judgment.
  • Ability to reason about semantic or ontology-based data models.
  • Strong communication and relationship skills, comfortable working directly with planners, buyers, and business stakeholders, and able to translate model outputs into decisions, not just insights.
  • Long-term ownership mindset for maintaining and evolving models over time.
  • Hands-on experience building or contributing to Palantir Foundry Workshop applications.
  • Background in wholesale or multi-channel retail, including familiarity with the operational differences between DTC, wholesale, and brick-and-mortar planning cycles.
  • Exposure to agent-based or AI-assisted analytics workflows.
  • Experience working within a formal data governance or steering-committee-driven prioritization structure.
  • Familiarity with Snowflake-native development patterns including dbt, Snowpark, or similar.

Responsibilities

  • Design, build, and maintain predictive and statistical models supporting retail planning, buying, inventory, and demand forecasting.
  • Operationalize models within Palantir Foundry using Code Workbook and Code Repository, working against Snowflake as the primary data source and ensuring outputs are production-grade and maintainable.
  • Own and evolve analytics logic and assumptions embedded in planning and buying workflows.
  • Partner with Planning, Buying, Merchandising, and DTC teams to translate business needs into model-driven insights, maintaining direct, ongoing relationships with the stakeholders and end users consuming model outputs.
  • Collaborate with the Enterprise Applications team on Workshop app development to ensure model outputs are surfaced effectively in business-facing Foundry applications.
  • Build models for customer segmentation, CLV, retention/churn, propensity, and campaign targeting (Customer 360).
  • Collaborate with Engineering, who owns data pipelines and Snowflake ingestion, to ensure model inputs are reliable, well-governed, and fit for purpose without owning the underlying data infrastructure.
  • Support human-in-the-loop workflows where planners and buyers can interrogate, override, or adjust model outputs in real time.
  • Validate model outputs against business reality and historical performance, closing the loop directly with end users on model accuracy and relevance.
  • Document methodologies and assumptions to support long-term ownership and institutional knowledge.