Notion • Posted 4 days ago
Full-time • Mid Level
Onsite • San Francisco, CA
11-50 employees

As Notion continues to grow rapidly, we're seeking talented data engineers to join our team and help us build the foundational datasets and pipelines for robust financial reporting. You'll be at the forefront of integrating our product, financial, and business systems to create rock-solid processes that will propel us forward. If you're passionate about analytics use cases, data models, and solving complex data problems, we want you on our team.

  • You'll build core datasets to serve as the sources of truth for Notion's financial reporting, integrating data from financial systems, business systems and Notion's product.
  • You'll partner closely with our Finance, Monetization Engineering, Business Intelligence and Data Science teams to support critical financial reporting and analysis needs.
  • You'll design, build and monitor pipelines that meet today's requirements but can gracefully scale with our growing data size.
  • You'll help democratize access to high quality financial data across Finance, Staff and go-to-market teams.
  • You've spent 4+ years as a data engineer building core datasets and supporting business verticals as needed, ideally in product and business areas with high data volumes.
  • You are passionate about analytics use cases, data models and solving complex data problems.
  • You've built integrations with and reporting datasets for payments, finance and business systems like Stripe, NetSuite, Adaptive, Anaplan, Salesforce and/or others.
  • You are a self-starter who continuously gathers and synthesizes high-impact needs from business partners, designs and implements the appropriate technical solutions, and communicates effectively about deliverables, timelines and tradeoffs.
  • You have hands-on experience shipping scalable data solutions in the cloud (e.g., AWS, GCP, Azure), across multiple data stores (e.g., Snowflake, Redshift, Hive, SQL/NoSQL, columnar storage formats) and methodologies (e.g., dimensional modeling, data marts, star/snowflake schemas).
  • You are a SQL expert. You intimately understand aggregation functions, window functions, UDFs, self-joins, and partitioning and clustering approaches to run correct, highly performant queries.
  • You are comfortable with object-oriented programming paradigms (e.g., Python, Java, Scala).
  • You have hands-on experience designing and building highly scalable and reliable data pipelines using the big data stack (e.g., Airflow, dbt, Spark, Hive, Parquet/ORC, Protobuf/Thrift).
  • You have hands-on experience building payment processing and invoice systems or have worked closely with teams that do this.