About The Position

Airbnb was born in 2007 when two Hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million Hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, Hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

The Community You Will Join

Communication and Connectivity Data: We build systems and tools for stays, experiences, and beyond that support first-class, rich communications and connections among all members of the Airbnb community. We create scalable and intelligent communication tools, including messaging and notifications. The data team uses industry-leading tools, builds scalable data systems, and applies AI models to provide insights and support all products in CnC.

The Difference You Will Make

Analytics Engineers build the data foundation for reporting, analysis, experimentation, and machine learning at Airbnb. We are looking for someone with expertise in metric development, data modeling, SQL, Python, and large-scale distributed data processing frameworks like Presto or Spark. Using these tools, you will transform data from data warehouse tables into valuable data artifacts that power impactful analytic use cases (e.g. metrics, dashboards). You will sit at the intersection of data science and data engineering, and work collaboratively to achieve highly impactful outcomes.
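As a rough illustration of the kind of work described above (turning warehouse tables into metric artifacts with Spark), the sketch below rolls a hypothetical messaging-events table up into a daily metric table. All table and column names are invented for the example and are not Airbnb's actual schema or pipeline.

```python
# Illustrative sketch only: aggregate a hypothetical warehouse fact table
# of messaging events into a daily metric table that dashboards can query.
# Table and column names are assumptions, not Airbnb's real schema.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_message_metrics").getOrCreate()

# Source fact table: one row per message event between a guest and a Host.
events = spark.table("warehouse.fct_message_events")

# Roll raw events up into reusable daily metrics.
daily_metrics = (
    events
    .where(F.col("event_type") == "message_sent")
    .groupBy(F.to_date("event_ts").alias("ds"))
    .agg(
        F.countDistinct("thread_id").alias("active_threads"),
        F.countDistinct("sender_id").alias("active_senders"),
    )
)

# Persist as a partitioned table so metrics, dashboards, and experiments
# read a single consistent artifact instead of raw event logs.
(
    daily_metrics.write
    .mode("overwrite")
    .partitionBy("ds")
    .saveAsTable("analytics.agg_daily_message_metrics")
)
```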

Requirements

  • 5+ years of experience with a BS/Masters or 2+ years with a PhD
  • Expertise in SQL and proficient in at least one data engineering language, such as Python or Scala
  • Experience with Superset and Tableau
  • Experience with event logging data modeling
  • Expertise in large-scale distributed data processing frameworks like Presto or Spark
  • Experience with an ETL framework like Airflow (see the DAG sketch after this list)
  • Clear and mature communication skills: ability to distill complex ideas for technical and non-technical stakeholders
  • Strong capability to forge trusted partnerships across working teams
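As a loose illustration of the Airflow requirement above, here is a minimal Airflow DAG that could schedule a daily metrics build like the PySpark sketch earlier. The DAG id, schedule, and spark-submit command are assumptions made for the example, and the parameter names assume a recent Airflow 2.x release.

```python
# Illustrative sketch only: a minimal Airflow DAG that rebuilds a daily
# metrics table once per day. Ids, paths, and schedule are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="agg_daily_message_metrics",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # rebuild the metric table once per day
    catchup=False,
) as dag:
    # Submit the hypothetical PySpark job sketched in the section above.
    build_metrics = BashOperator(
        task_id="build_daily_message_metrics",
        bash_command="spark-submit jobs/daily_message_metrics.py",
    )
```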

Responsibilities

  • Develop high-quality data assets for product use cases
  • Develop frameworks and tools to scale insight generation to meet critical business and product requirements
  • Collaborate and build strong partnerships with Product, AI/ML, and Data Science
  • Influence the trajectory of data in decision making
  • Improve trust in our data by championing data quality across the stack
  • Influence event logging instrumentation best practices and participate in architecture designs

Benefits

  • This role may also be eligible for bonus, equity, benefits, and Employee Travel Credits.