Senior Data Developer - Streaming

MaintainX · Montreal, QC

About The Position

MaintainX is the world's leading Asset and Work Intelligence platform for industrial and frontline environments: a modern, IoT-enabled, cloud-based tool for reliability, safety, and operations on physical equipment and facilities. MaintainX powers operational excellence for 13,000 businesses, including Duracell, Univar Solutions Inc., Titan America, McDonald's, Brenntag, Cintas, Xylem, and Shell. We recently completed a $150 million Series D funding round, bringing our total funding to $254 million and valuing the company at $2.5 billion.

We are seeking a Senior Data Developer with a strong background in event streaming to join our growing team. You will help build and maintain the data streaming platform that directly powers the MaintainX product and enables internal analytics, while developing streaming platform capabilities and tooling that engineering teams use daily. You will own the data platform's CDC streaming service end to end: runtime, reliability, capabilities, deployment, governance, and developer tooling.

Requirements

  • 4+ years of experience building and operating production-grade event streaming pipelines in a modern cloud data environment
  • Strong familiarity with Kafka: topic design, consumer groups, retention policies, event replayability, schema management, partitioning, and indexing
  • Experience with CDC tooling (e.g., Debezium, DMS) for real-time database change capture
  • Hands-on experience with Apache Flink
  • Strong infrastructure-as-code skills with Terraform or Atmos; comfortable managing cloud infrastructure across multiple AWS accounts and regions
  • Proficiency in Python or Java for Flink application development and streaming tooling
  • Experience building and evolving CI/CD pipelines
  • Strong reliability engineering instincts: alerting design, runbook authorship, load testing, and failure recovery planning for distributed systems
  • Experience working collaboratively in a fast-paced, cross-functional environment

Nice To Haves

  • Familiarity with schema management in a CDC context
  • Familiarity with OpenSearch
  • Knowledge of compliance and regulatory frameworks (e.g., FedRAMP, SOC 2, GDPR)

Responsibilities

  • Build and operate the end-to-end CDC streaming platform (Debezium, Kafka, Flink) that produces near-real-time data products
  • Own the streaming infrastructure (Kafka, Flink) using Terraform and Atmos IaC, including multi-region deployments
  • Build and maintain CI/CD pipelines for the CDC-streaming platform
  • Define and enforce pipeline reliability standards
  • Instrument and maintain end-to-end observability for the streaming pipeline
  • Build self-service tooling and runbooks for onboarding new CDC sources, including automation scripts, snapshot reconciliation checks, and operational documentation
  • Collaborate with engineering teams to expand the CDC footprint, support new streaming data use cases, and evolve the streaming architecture

Benefits

  • Competitive salary and meaningful equity opportunities
  • Healthcare, dental, and vision coverage
  • 401(k) / RRSP enrollment program
  • Take-what-you-need PTO


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 251-500 employees

© 2026 Teal Labs, Inc