Senior Data Engineer - Knowledge Platform

Airwallex · San Francisco, CA

About The Position

Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 200,000 businesses worldwide – including Brex, Rippling, Navan, Qantas, SHEIN and many more – with fully integrated solutions to manage everything from business accounts, payments, spend management and treasury, to embedded finance at a global scale. Proudly founded in Melbourne, we have a team of over 2,000 of the brightest and most innovative people in tech across 26 offices around the globe. Valued at US$8 billion and backed by world-leading investors including T. Rowe Price, Visa, Mastercard, Robinhood Ventures, Sequoia, Salesforce Ventures, DST Global, and Lone Pine Capital, Airwallex is leading the charge in building the global payments and financial platform of the future.

If you’re ready to do the most ambitious work of your career, join us. We hire successful builders with founder-like energy who want real impact, accelerated learning, and true ownership. You bring strong role-related expertise and sharp thinking, and you’re motivated by our mission and operating principles. You move fast with good judgment, dig deep with curiosity, and make decisions from first principles, balancing speed and rigor. You’re humble and collaborative; you turn zero-to-one ideas into real products, and you “get stuff done” end-to-end. You use AI to work smarter and solve problems faster. Here, you’ll tackle complex, high-visibility problems with exceptional teammates and grow your career as we build the future of global banking. If that sounds like you, let’s build what’s next.

The Responsibilities below reflect the four main focus areas of the Data Engineering team. Ideally, you are strong in at least one of them, especially data modeling or data ETL; experience or skills in the other areas would be a big plus.

Requirements

  • Bachelor’s degree or higher in Computer Science, Information Systems, Finance, Maths or a related field.
  • Minimum of 5 years of proven experience designing and implementing ETL pipelines using tools such as Informatica, Talend, Apache NiFi, or similar data integration platforms.
  • Proficiency in SQL, database management systems (e.g., MySQL, PostgreSQL, Oracle), and data warehousing solutions.
  • Familiarity with Google Cloud Platform (GCP), specifically BigQuery and Airflow.
  • Excellent problem-solving skills, with a keen attention to detail and a commitment to producing high-quality work.
  • Strong verbal communication and collaboration skills, with the ability to work effectively in a fast-paced, team-oriented environment and with globally distributed teams.

Nice To Haves

  • Experience with financial industries, payment systems, or fintech platforms.
  • Knowledge of data governance practices and regulatory requirements in the financial industry.
  • Experience with scripting languages (e.g., Python, R) for data analysis and automation.
  • Certification in data management or related technologies.
  • Worked with data across distributed or multi-datacenter systems, including solving challenges related to data migration, duplication, and consistency.

Responsibilities

  • Design and implement robust and scalable data models that support business intelligence, machine learning, and operational needs.
  • Possess a deep understanding of data schemas and be able to select appropriate schema designs (e.g., star schema, snowflake, normalised vs denormalised) based on use cases.
  • Collaborate closely with business teams to translate their data needs into clean, structured, and well-documented models.
  • Understand and promote the concept of SSOT (Single Source of Truth) throughout the data layers and pipelines.
  • Maintain data consistency, traceability, and quality across multiple data sources and domains.
  • Build and maintain both batch and streaming ETL pipelines, with a strong understanding of the end-to-end data workflow, from ingestion to transformation and delivery.
  • Work closely with Data Platform Engineers (DPEs) and Product Managers (PMs) to quickly identify root causes of data issues and provide efficient, scalable solutions.
  • Participate in and contribute to data governance strategies, policies, and standards.
  • Be familiar with any of the six key pillars of traditional data governance: data quality, data stewardship, metadata management, master data management, data privacy/security, and data lifecycle.
  • Think about how data engineering and AI can work together in practical and creative ways.
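To illustrate the star-schema modeling mentioned above, here is a minimal, purely hypothetical sketch: one fact table with foreign keys out to two dimension tables, built in SQLite for portability (the role's actual stack is BigQuery, but the modeling concept is the same). All table and column names are invented for this example.

```python
# Hypothetical star-schema sketch: fact_payment references two dimensions.
# SQLite stands in for BigQuery here purely so the example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute("CREATE TABLE dim_merchant (merchant_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_currency (currency_id INTEGER PRIMARY KEY, code TEXT)")

# The fact table holds the measures plus a foreign key to each dimension.
cur.execute("""CREATE TABLE fact_payment (
    payment_id  INTEGER PRIMARY KEY,
    merchant_id INTEGER REFERENCES dim_merchant(merchant_id),
    currency_id INTEGER REFERENCES dim_currency(currency_id),
    amount      REAL)""")

cur.execute("INSERT INTO dim_merchant VALUES (1, 'Acme')")
cur.execute("INSERT INTO dim_currency VALUES (1, 'USD')")
cur.executemany("INSERT INTO fact_payment VALUES (?, ?, ?, ?)",
                [(1, 1, 1, 100.0), (2, 1, 1, 50.0)])

# A typical BI query: aggregate the fact table, joining out to the dimensions.
row = cur.execute("""
    SELECT m.name, c.code, SUM(f.amount)
    FROM fact_payment f
    JOIN dim_merchant m USING (merchant_id)
    JOIN dim_currency c USING (currency_id)
    GROUP BY m.name, c.code
""").fetchone()
print(row)  # ('Acme', 'USD', 150.0)
```

A snowflake variant would further normalize the dimensions (e.g., splitting merchant region into its own table); the star form trades some redundancy for simpler, faster BI queries.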

Benefits

  • Competitive salary
  • Valuable equity
  • Collaborative open office space with a fully stocked kitchen
  • Regular team-building events
  • Freedom to be creative