Senior Data Engineer, Event Data

Movable Ink · Toronto, ON

About The Position

Movable Ink scales content personalization for marketers through data-activated content generation and AI decisioning. The world’s most innovative brands rely on Movable Ink to maximize revenue, simplify workflows, and boost marketing agility. Headquartered in New York City with close to 600 employees, Movable Ink serves its global client base with operations throughout North America, Central America, Europe, Australia, and Japan.

The Senior Data Engineer, Event Data will join a newly formed team responsible for building the systems that ingest, process, and serve the massive volume of client and internal event data that powers Movable Ink's platform. This is a ground-floor opportunity to shape the architecture of a critical data domain, working with modern streaming technologies, analytical databases, and Elixir and Python to solve problems at billion-event scale. You will help define how event data flows through the platform, making it faster, more reliable, and highly available for teams and clients across the organization.

Requirements

  • 6+ years of professional experience in data engineering or backend/systems engineering, with significant focus on event-driven and streaming data systems
  • Strong proficiency in Elixir and/or Python as a primary programming language for building application connectors, data services, and pipeline components
  • Advanced SQL skills for data modeling, query optimization, and analytical workloads
  • Hands-on experience with columnar/OLAP (Online Analytical Processing) databases at production scale
  • Experience with stream processing frameworks and message brokers such as Apache Flink, Kafka, Pulsar, or Kinesis; Flink experience is a strong plus
  • Demonstrated ability to integrate and migrate systems, bridging legacy and modern architectures
  • Proven track record of operationalizing data pipelines, including building monitoring, alerting, SLA dashboards, and runbooks for production systems
  • Experience designing and operating data systems on AWS; GCP experience is a plus
  • Strong collaboration and communication skills, comfortable leading design discussions, writing technical specs, and working across team boundaries
  • Experience with Infrastructure-as-Code (IaC) tools such as Terraform, CloudFormation, or similar

Nice To Haves

  • Experience with retail events data such as clickstream, purchase events, or product interaction data is a plus
  • Experience with Databricks is a plus

Responsibilities

  • Design, build, and maintain event streaming pipelines that ingest data from client systems, internal services, and third-party sources into the data platform
  • Develop and operate analytical databases and data models optimized for high-volume event data queries and low-latency access
  • Write production Elixir and Python services for event processing, transformation, and routing
  • Integrate legacy event pipelines with modern streaming infrastructure, designing migration paths that minimize risk and disruption to downstream consumers
  • Build and maintain monitoring, alerting, and observability tooling for event data systems, ensuring pipeline health, data freshness, and SLA compliance
  • Define and enforce event schemas, data contracts, and quality standards in partnership with producing and consuming teams
  • Collaborate with the data platform, product engineering, and analytics teams to understand data needs and deliver reliable event data products
  • Participate in system design reviews and help establish best practices for the Events Data team