Sr. Data Scientist, AI Delivery/Deployment

AppFolio – Santa Barbara, CA

About The Position

Hi, we’re AppFolio. We’re innovators, changemakers, and collaborators. We’re more than just a software company – we’re pioneers in cloud and AI who deliver magical experiences that make our customers’ lives easier. We’re revolutionizing how people do business in the real estate industry, and we want your ideas, enthusiasm, and passion to help us keep innovating.

We believe ML and AI are powerful tools – but not universal solutions. This role exists to ensure we deploy AI deliberately, responsibly, and only where it meaningfully improves internal outcomes. Success in this role is measured by the adoption of the tools you build: production ML/AI systems that drive measurable business outcomes (e.g., cost reduction, efficiency gains). We’re aiming for adoption, data democratization, and business enablement. You will determine the most effective path forward – whether that involves leveraging existing AI Factory infrastructure or building bespoke systems from scratch when business logic requires it. You will advocate for "AI where it adds value," ensuring we avoid high-cost, low-impact complexity in favor of robust, scalable solutions that move the needle for our stakeholders.

Your Impact

  • Deploy End-to-End ML/AI: Lead the design and delivery of custom ML/AI workflows, ranging from classical regression and classification models to agentic LLM systems.
  • Strategic Partnership: Navigate ambiguity by partnering with product, business, and engineering stakeholders to translate complex business challenges into concrete ML/AI roadmaps.
  • Drive ML Explainability & Narrative Insights: Translate complex "black-box" model predictions and feature importance into human-readable narratives. You will build the translation layer that makes sophisticated data science insights accessible and actionable for executive stakeholders.
  • Operationalize Value-Focused Observability: Design and implement observability frameworks to track the full lifecycle of deployments, monitoring business ROI and model health.
  • Standardize Infrastructure & Engineering: Establish reusable code libraries, modeling frameworks, and orchestration standards to accelerate model delivery.

Requirements

  • Technical Prowess – You possess the mathematical depth to build sophisticated models and the software engineering rigor to deploy them.
  • Deployment Mindset – You don't consider a project "done" when the notebook is finished; you thrive on the challenge of getting models into the hands of users and keeping them running at scale.
  • Business Acumen – You understand key challenges facing our business and partner with stakeholders to find creative ways to apply AI to solve them.
  • Communication & Storytelling – You can translate complex technical concepts into compelling narratives for non-technical stakeholders. You are comfortable presenting to leadership, justifying AI investments with ROI, and setting the vision for AI initiatives across the company.
  • Efficiency – You iterate quickly on data generation and refinement and look for ways to improve processes to maximize efficiency and remove redundancy.
  • Custom LLM Workflow Experience: Proven track record of customizing and deploying LLM workflows for specific, non-generic use cases.
  • Classical ML Mastery: Advanced knowledge of classical machine learning techniques such as regression and classification.
  • Bachelor’s degree in a STEM field and a minimum of 6 years of experience in a related field.
  • Cloud Services Expertise: Demonstrated experience working with and deploying models within major cloud environments (AWS, Azure, GCP, or Snowflake).
  • Data Translation: Deep understanding of Feature Importance (SHAP/LIME) and how to map these values to semantic context.
  • Strong programming background (Python, SQL, version control, system design) with experience writing production-grade, modular code.
  • Effective Communication: Strong listening and interpersonal skills; ability to communicate with cross-functional partners in both technical and business terms.

Nice To Haves

  • Snowflake Proficiency: Hands-on experience navigating the broader Snowflake Data Cloud ecosystem (Snowpark, Cortex, Streamlit).
  • Vector Database Experience: Familiarity with vector search architectures or managed vector search offerings (e.g., Pinecone, Weaviate, Cortex Search).
  • Model Context Protocol (MCP): Familiarity with MCP for creating standardized, interoperable connections between LLMs and data sources or proprietary tools.
  • Experience building AI Agents for automated code-refactoring or SQL generation.
  • Experience using dbt (data build tool) or orchestration frameworks such as Airflow.

Benefits

  • Regular full-time employees are eligible for benefits.