Summation
Full-time • Mid Level
Hybrid • Bellevue, WA
11-50 employees

Summation is building the future of business planning and analytics by bridging the gap between data and decision-making. We empower organizations to transform complex data into clear, actionable insights that drive business outcomes. Our AI-native platform integrates advanced data models with intuitive workflows, making enterprise performance management simple, collaborative, and effective.

We're looking for Forward Deployed Data Scientists to work directly with our enterprise clients: understanding their business, wrangling their data, and building the systems that let them actually run their business with data. This is hands-on, high-impact work. You'll be embedded with enterprise clients, helping them go from "our data is a mess and everything lives in spreadsheets" to "we have systems that actually let us run the business." The interesting part isn't just solving one client's problem—it's figuring out how to solve it in a way that's repeatable, scalable, and increasingly automated.

The work involves SQL and Python, but the real job is architecting processes that work. You're not just writing queries—you're building the systems and workflows that let us (and AI) do this work faster and more reliably for every client that comes after.

What you'll do:
  • Work directly with client finance, analytics, and operations teams to understand their data and what they're trying to accomplish
  • Translate undocumented schemas and fragmented datasets into clean, structured data—and do it in a way that's repeatable with AI, not just a one-off
  • Build the analytical foundation that lets clients actually run their business with data (resource allocation, scenario planning, business reviews that produce decisions, not just slides)
  • Apply statistical methods and modeling to answer business questions and validate that the systems are working
  • As you solve problems for specific clients, extract reusable patterns and components. Your work compounds: the systems you build for one client become the starting point for the next five.
  • Help build the playbook and tooling that lets us onboard future clients faster
  • Contribute to our understanding of how to teach AI to do more of this work autonomously
  • Develop forecasting models and optimization systems that generalize across clients
  • Supervise teams of AI-powered agents to do data science work at scale—think of yourself as managing a squad of fast, capable (but imperfect) junior analysts
  • Re-engineer how data work gets done: what used to take two weeks should take two days, and what took two days should be automatic
  • Build the sanity checks and feedback loops that let you trust AI outputs—if something's off, you should know immediately
  • Continuously improve our workflows—kaizen for the AI era. Figure out what the AI can't do yet, teach it, and iterate

What we're looking for:
  • Strong SQL and data fundamentals. You can write complex queries, design schemas, and debug data issues. But more importantly, you understand data well enough to architect processes around it—not just execute tasks.
  • Production mindset. You don't just hand off a Jupyter notebook; you build systems that run reliably long after you've left the room.
  • Python and statistical fluency. You're comfortable with the modern data science stack and can apply statistical methods to real problems. You understand when a simple heuristic beats a complex model.
  • Experience with data modeling and financial/business metrics. You've built KPIs, dashboards, business reviews, or similar. You understand what a P&L is and aren't scared of accounting concepts like journal entries and allocations.
  • Product and business intuition. You can look at a business problem and figure out what needs to be built, not just how to build the schema. You could probably be a PM or a BizOps lead, but you chose to be technical because you like building things that work.
  • Comfort with ambiguity and client-facing work. You can talk to a VP of Finance, understand their problem, and translate it into a data solution. You don't need everything defined before you start.
  • AI fluency. You understand how to supervise AI—setting up feedback loops, verifying outputs, knowing when to trust it and when to dig in yourself.
  • High ownership and motivation. We care more about your drive than your pedigree. Performance = motivation × capability, and if you're motivated, you'll acquire whatever capabilities you're missing.

Nice to have:
  • Experience at a growth-stage startup where you had to scale operations, build from scratch, or wear multiple hats
  • Familiarity with dbt, Snowflake, Airflow, or similar modern data stack tools
  • Prior work on pricing, marketplace dynamics, financial reporting, or resource allocation problems
  • Experience with forecasting, optimization, or reinforcement learning concepts (we're building systems that help businesses allocate resources dynamically)
  • Experimentation and causal inference background—A/B tests, pricing experiments, propensity matching. Knowing how to measure the impact of interventions, not just describe correlations.
  • Bayesian thinking—comfort with uncertainty, updating beliefs with data, and building models that reflect how the world actually works

Benefits:
  • Competitive salary and equity options
  • Remote-friendly, with the expectation of monthly travel to Bellevue and periodic client visits
  • Flexible (Unlimited) Paid Time Off
  • Medical, Dental, and Vision benefits for you and your family
  • 401(k) Plan
  • Parental Leave
  • Opportunities for growth and career development