About The Position

Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

The Community You Will Join

The mission of the DBExports team in the Online Database Infrastructure org is to provide a managed, reliable, performant, and scalable platform for exporting data out of Airbnb's online databases for offline processing. The team currently builds and operates two managed solutions: one for near-real-time access to change logs (CDC) and another that provides periodic mutation and full table snapshots. As a member of this team, you will work with talented engineers on cutting-edge technologies to help support Airbnb's business. You will be the resident expert on the online data exports platform, guiding and collaborating with internal product teams so they can use the platform effectively. The specific responsibilities of the role are listed below.

Requirements

  • 5+ years of experience building and operating large-scale core backend distributed systems such as storage, data ingestion, backup and restore, and streaming.
  • Ability to own and dive deeply into a complex codebase.
  • Experience maintaining, analyzing, and debugging production systems.
  • Knack for writing clean, readable, testable, maintainable code.
  • Strong collaboration and communication skills in a remote-working environment.
  • Demonstrated strong ownership and a track record of delivering consistently and on time.
  • Experience working in Java, Scala, or Python.

Nice To Haves

  • Experience with building large scale data exports/ingestion platforms.
  • Experience with building large scale distributed databases.
  • Experience with AWS and/or GCP.
  • Experience working with Spark, Kafka, Flink, Kubernetes, Airflow, AWS Aurora, or TiDB.

Responsibilities

  • Build and operate a data ingestion system that enables various ways of accessing data at Airbnb, including ingesting DB data into the warehouse in various formats and at various frequencies, and streaming change data capture (CDC) in near real time.
  • Be hands-on (code, design, test) and collaborate with cross-team partners (internal customers, dependencies, and leadership) to deliver multi-month projects on schedule.
  • Raise operational standards by proactively identifying, debugging, and fixing operational issues; participate in the on-call rotation for the DBExports platform.
  • Mentor junior engineers on the team.

Benefits

  • This role may also be eligible for bonus, equity, benefits, and Employee Travel Credits.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000 employees
