Senior Data Engineer

Fundraise Up
Remote

About The Position

As a Senior Data Engineer, you will design, build, and optimize scalable data pipelines and ETL/ELT processes. You will be the first engineer in this domain, taking full ownership of it; as the product and data needs grow, there will be an opportunity to help build out the team. This role requires a high level of autonomy and ownership, combined with close collaboration with the Analytics, Data Science, and Engineering teams.

Fundraise Up is a global fundraising platform that powers tens of millions of dollars in donations monthly for nonprofits such as UNICEF and the Alzheimer’s Association. The platform includes a modern checkout experience, customizable widgets, donor, organization, and partner portals, admin tools, and internal apps. The backend uses Node.js (Koa, NestJS) and MongoDB, while the frontend uses Webpack, Vue.js, and React, primarily with TypeScript. Kafka and Bull (Redis) handle high-throughput messaging and background processing, ClickHouse powers analytics, and Elasticsearch powers search.

The team is distributed across several European countries and fosters a culture of technical curiosity, knowledge sharing, thoughtful collaboration, strong engineering practices, and a product mindset.

Requirements

  • 7+ years of experience as a Data Engineer.
  • 5+ years of experience with Python, TypeScript, Node.js, and Kafka.
  • Strong understanding of data processing algorithms and principles.
  • Hands-on experience with ClickHouse, Airflow, Amazon S3, Git, and Docker.
  • Solid understanding of Data Lake and Data Warehouse architectures.
  • Experience working with large-scale data and query optimization.
  • Ability to work collaboratively toward shared goals.
  • Strong sense of ownership, responsibility, and proactivity.
  • English level: B1+.

Nice To Haves

  • Experience with Apache Parquet, MLflow, and MongoDB.

Responsibilities

  • Design and evolve the architecture of the data platform and storage systems.
  • Build reliable ETL/ELT processes and develop scalable data pipelines for delivering data into a centralized analytical warehouse.
  • Maintain and further develop the Data Warehouse.
  • Collaborate with engineering and analytics teams on system design and architectural decisions.
  • Ensure data governance and maintain high standards of data quality.
  • Write and optimize queries for MongoDB and ClickHouse.
  • Manage and maintain workflows in Airflow.

Benefits

  • 31 days off
  • 100% paid telemedicine plan
  • Home office setup assistance: the company helps with purchasing furniture (office chair, desk, monitor) and other items to create a comfortable workspace
  • English learning courses
  • Relevant professional education
  • Gym or swimming pool
  • Co-working
  • Remote working
  • A strong, collaborative product team that owns what it builds
  • Clear product vision and access to real customer feedback from global nonprofit leaders
  • Flat structure: no politics, just great work with great people
  • Transparent company culture: we share how we’re growing, where revenue comes from, and what’s next
  • Long-term focus: we offer equity options and value sustained, meaningful contribution