About The Position

Snowflake is about empowering enterprises to achieve their full potential — and people too. With a culture that’s all in on impact, innovation, and collaboration, Snowflake is the sweet spot for building big, moving fast, and taking technology — and careers — to the next level.

We are looking for a highly analytical and technically strong FinOps Analytics Consultant with a quantitative or data science background. In this role, you will use advanced SQL and Python to analyze large-scale datasets, model cloud consumption behaviors, and create data-driven insights for Snowflake customers. Snowflake platform expertise and FinOps skills are not required — we will train you. What matters most is your ability to work with complex data, derive insights, think in unit economics, and clearly communicate findings to business and technical stakeholders.

Requirements

  • 3–5+ years of experience in data science, quantitative analysis, analytics engineering, or applied statistics roles.
  • Strong SQL skills — ability to query, aggregate, model, and interpret large datasets.
  • Strong Python skills (pandas, numpy, data modeling, exploratory analysis).
  • Solid understanding of statistics, experimentation, time series, optimization techniques, or benchmarking.
  • Experience working with large-scale datasets from data warehouses, cloud environments, or analytics platforms.
  • Ability to translate complex data into simple unit economics and business insights.
  • Experience preparing clear executive-level narratives, dashboards, or insights reports.
  • Strong critical thinking and structured problem solving.
  • Experience presenting findings to non-technical stakeholders.
  • Ability to break complex concepts into simple explanations.
  • Strong ownership mentality, curiosity, and willingness to learn specialized Snowflake tooling.

Nice To Haves

  • Snowflake workload architecture
  • FinOps principles and cloud economics
  • Cloud computing cost models (AWS/GCP/Azure)
  • Query performance tuning concepts
  • Data platform performance engineering

Responsibilities

  • Analyze large-scale consumption, workload, and performance datasets to uncover insights, trends, and optimization opportunities.
  • Build unit economic models such as cost per query, cost per TB scanned, cost per user, and efficiency benchmarks across workloads (a minimal sketch follows this list).
  • Explore and interpret internal metadata pipelines created by Product/Data Science teams to understand compute, storage, and pipeline behaviors.
  • Translate quantitative findings into clear, actionable insights that help customers reduce waste and improve ROI.
  • Develop reusable analytical frameworks for consumption modeling and workload optimization.
  • Partner with account teams, Sales Engineers, and Value Engineers to support customer conversations with data-backed recommendations.
  • Build customer-facing deliverables using Jupyter notebooks, dashboards, and executive-ready PowerPoint narratives.
  • Continuously refine analytical approaches to improve accuracy, benchmarking quality, and scale of FinOps engagements.
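
As a rough illustration of the unit economic models described above, the sketch below computes cost per query, cost per TB scanned, cost per user, and a simple per-workload efficiency benchmark from a hypothetical per-query consumption extract. The DataFrame columns (credits_used, bytes_scanned, user_name, warehouse) and the flat credit price are illustrative assumptions, not Snowflake's actual metadata schema or pricing.

```python
# Minimal sketch of unit-economics metrics from a hypothetical per-query
# consumption extract. Column names and the flat credit price are assumptions
# for illustration only.
import pandas as pd

CREDIT_PRICE_USD = 3.00  # assumed flat price per credit (illustrative)

# Hypothetical query-level extract: one row per query execution.
queries = pd.DataFrame({
    "user_name":     ["analyst_a", "analyst_a", "etl_svc", "etl_svc", "etl_svc"],
    "warehouse":     ["BI_WH", "BI_WH", "ETL_WH", "ETL_WH", "ETL_WH"],
    "credits_used":  [0.02, 0.05, 0.40, 0.35, 0.50],
    "bytes_scanned": [1.2e9, 3.5e9, 2.1e11, 1.8e11, 2.6e11],
})

queries["cost_usd"] = queries["credits_used"] * CREDIT_PRICE_USD
queries["tb_scanned"] = queries["bytes_scanned"] / 1e12

# Unit economics: cost per query, cost per TB scanned, cost per user.
cost_per_query = queries["cost_usd"].mean()
cost_per_tb_scanned = queries["cost_usd"].sum() / queries["tb_scanned"].sum()
cost_per_user = queries.groupby("user_name")["cost_usd"].sum()

# Efficiency benchmark across workloads (here, warehouses): cost per TB scanned.
workload_benchmark = (
    queries.groupby("warehouse")
    .agg(total_cost_usd=("cost_usd", "sum"), total_tb=("tb_scanned", "sum"))
    .assign(cost_per_tb_usd=lambda d: d["total_cost_usd"] / d["total_tb"])
)

print(f"Average cost per query: ${cost_per_query:.4f}")
print(f"Cost per TB scanned:    ${cost_per_tb_scanned:.2f}")
print(cost_per_user)
print(workload_benchmark)
```

In an actual engagement, ratios like these would more likely be computed in SQL against consumption metadata first, with Python used for exploration and benchmarking on top.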

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: Not listed
  • Number of Employees: 5,001-10,000
