Analytics Engineer II

CoreWeave · Sunnyvale, CA
$122,000 - $179,000 · Hybrid

About The Position

The Analytics Engineering Team turns data into impact. We build and maintain CoreWeave’s enterprise semantic layer, which serves as the foundation for decision-making, AI, and operational insight across the company. By shipping modeled, documented, and governed data products, we drop the barrier to entry for business intelligence and data science through the floor, empowering users across the business to move faster and smarter.

About the role: We’re hiring an Analytics Engineer II to help build the last mile of analytics at CoreWeave. In this role, you’ll contribute to our enterprise semantic layer, transforming raw data into trusted models, metrics, and analytics-ready datasets used by teams across the company. You’ll work across the analytics stack, using tools like dbt, Airflow, Spark, StarRocks, Tableau, and Omni to deliver reusable data models, dashboards, and analytics products that support reporting, self-service analytics, and emerging AI use cases. You’ll partner with data engineering, finance, product, and operations to implement shared business logic and ensure data is clear, consistent, and decision-ready. This is a hands-on role for someone who wants to grow their impact, deepen their analytics engineering skills, and help shape how CoreWeave uses data.

Requirements

  • Experience working in fast-paced, complex data environments, contributing to analytics solutions that support real business use cases.
  • 3+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience modeling analytics-ready data using dbt with SQL and/or Python.
  • Strong SQL skills, with experience writing, debugging, and optimizing analytical queries.
  • Experience querying MPP analytical databases such as StarRocks, Snowflake, BigQuery, or Redshift.
  • Experience with one or more modern BI tools (e.g., Omni, Tableau, Looker, Power BI), including building dashboards, metrics, and semantic models that support self-serve analytics.
  • Hands-on experience orchestrating or contributing to data workflows using Airflow, Dagster, or equivalent tools.
  • Working proficiency in at least one scripting language (e.g., Python, Bash, R, or Julia).
  • Ability to translate analytical requirements into well-modeled datasets, metrics, and visualizations used by business stakeholders.

Nice To Haves

  • Experience contributing to production data pipelines or data platforms, with exposure to software engineering best practices such as testing, CI/CD, and code review.
  • Familiarity with stream processing or data transport systems (e.g., Kafka, Flink), including a basic understanding of how they are used in production data systems.
  • Familiarity with open table formats such as Iceberg, Hudi, Delta, or Paimon, and an understanding of their role in lakehouse architectures and analytics workflows.

Benefits

  • Medical, dental, and vision insurance - 100% paid for by CoreWeave
  • Company-paid Life Insurance
  • Voluntary supplemental life insurance
  • Short and long-term disability insurance
  • Flexible Spending Account
  • Health Savings Account
  • Tuition Reimbursement
  • Ability to Participate in Employee Stock Purchase Program (ESPP)
  • Mental Wellness Benefits through Spring Health
  • Family-Forming support provided by Carrot
  • Paid Parental Leave
  • Flexible, full-service childcare support with Kinside
  • 401(k) with a generous employer match
  • Flexible PTO
  • Catered lunch each day in our office and data center locations
  • A casual work environment
  • A work culture focused on innovative disruption