Software Engineer, Streaming

Retail Services WIS Corporation · Plano, TX
$120,000 - $150,000 · Hybrid · posted 2d ago

About The Position

We are seeking a Streaming Platform Engineer to design, implement, and operate high-throughput, low-latency event streaming systems using modern distributed messaging platforms (e.g., Apache Kafka, Apache Pulsar, Azure Event Hubs, Amazon Kinesis, Google Pub/Sub). You will build resilient producers/consumers, manage schema evolution, ensure exactly-once delivery, and integrate streaming data with RDBMS, NoSQL, and analytics systems.
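To make the exactly-once concern above concrete, here is a minimal, broker-free sketch of an idempotent consumer in plain Python (no Kafka/Pulsar client; all names are illustrative). Deduplicating by a unique event ID turns at-least-once delivery into effectively-once processing:

```python
# Sketch of idempotent ("effectively exactly-once") consumption.
# A real system would use a Kafka/Pulsar client and a durable store
# for processed event IDs; a set and a dict stand in here.

class IdempotentConsumer:
    def __init__(self):
        self.seen_ids = set()   # stand-in for a durable dedup store
        self.totals = {}        # stand-in for downstream state

    def handle(self, event):
        """Apply an event at most once, keyed by its unique event_id."""
        if event["event_id"] in self.seen_ids:
            return False  # duplicate delivery: skip side effects
        self.seen_ids.add(event["event_id"])
        key = event["key"]
        self.totals[key] = self.totals.get(key, 0) + event["amount"]
        return True

consumer = IdempotentConsumer()
# At-least-once transports may redeliver; the duplicate is ignored.
events = [
    {"event_id": "e1", "key": "cart-42", "amount": 10},
    {"event_id": "e2", "key": "cart-42", "amount": 5},
    {"event_id": "e1", "key": "cart-42", "amount": 10},  # redelivery
]
applied = [consumer.handle(e) for e in events]
```

The same pattern underlies Kafka's idempotent producer and transactional consumers: delivery may repeat, but state transitions are applied once.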

Requirements

  • Apache Kafka or Apache Pulsar (deep expertise in at least one); familiarity with Kinesis, Event Hubs, Pub/Sub, Redpanda
  • Go, Java, or Scala
  • Kafka Streams, ksqlDB, Pulsar Functions, Flink, Spark Streaming
  • Avro, Protobuf, JSON Schema, Schema Registry (Confluent, Apicurio, Pulsar Schema)
  • Debezium, Kafka Connect, Pulsar IO, AWS DMS, Azure CDC
  • Azure SQL/PostgreSQL/MySQL (indexing, partitioning), Cassandra/DynamoDB/MongoDB
  • AWS, GCP, Azure; Docker, Kubernetes, Terraform, CI/CD

Nice To Haves

  • Migrated from Kafka to Pulsar (or vice versa) in production.
  • Built multi-tenant streaming platforms with isolation and quota enforcement.
  • Used event sourcing, CQRS, or domain-driven design with streams.
  • Contributed to Strimzi, Pulsar Operators, or open-source connectors.
  • Certified: Confluent Certified Developer, Databricks Apache Spark, AWS Data Analytics, etc.

Responsibilities

  • Design and manage streaming clusters (Kafka, Pulsar, Event Hubs, Kinesis) across cloud environments.
  • Configure topics, partitions, retention, replication, and geo-redundancy.
  • Implement schema management (Schema Registry, Avro, Protobuf, JSON).
  • Ensure high availability and disaster recovery.
  • Build fault-tolerant producers/consumers in Java, Scala, Python, or Go.
  • Develop real-time pipelines using Kafka Streams, Pulsar Functions, Flink, Spark Streaming, or Kinesis Analytics.
  • Guarantee reliable delivery and ordered processing.
  • Deploy CDC pipelines (Debezium, Maxwell, MongoDB Change Streams).
  • Sync data to warehouses (Snowflake, BigQuery, Redshift) and operational DBs (PostgreSQL, Cassandra, DynamoDB).
  • Design schemas, indexes, and partitioning in RDBMS (Azure SQL, PostgreSQL, MySQL) and NoSQL (Cassandra, DynamoDB, MongoDB) for high-velocity writes.
  • Optimize query performance for event-sourced or streaming-derived data.
  • Manage data consistency between streams and persistent stores (event sourcing, CQRS patterns).
  • Implement monitoring (Prometheus, Grafana, Datadog).
  • Set up alerts, schema checks, and dead-letter queues.
  • Automate deployments with Terraform, Helm, Kubernetes Operators.
  • Write integration and resilience tests.
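The retry and dead-letter-queue pattern mentioned above can be sketched without a broker. This is an in-memory illustration only: the function name, the `max_retries` setting, and the list standing in for a DLQ topic are all hypothetical, not a real client API:

```python
# In-memory sketch of a retry + dead-letter-queue (DLQ) pattern.
# In production the DLQ would be a separate Kafka/Pulsar topic and
# retries might use backoff; plain lists and a loop stand in here.

def consume_with_dlq(messages, process, max_retries=3):
    """Try each message up to max_retries times; route failures to the DLQ."""
    dead_letters = []
    for msg in messages:
        for attempt in range(1, max_retries + 1):
            try:
                process(msg)
                break  # processed successfully; move to next message
            except Exception as exc:
                if attempt == max_retries:
                    # Preserve payload and failure reason for later replay.
                    dead_letters.append({"message": msg, "error": str(exc)})
    return dead_letters

def process(msg):
    # Illustrative handler: reject messages flagged as malformed.
    if msg.get("malformed"):
        raise ValueError("schema validation failed")

dlq = consume_with_dlq(
    [{"id": 1}, {"id": 2, "malformed": True}, {"id": 3}],
    process,
)
```

Routing poison messages aside this way keeps the main consumer loop flowing instead of blocking the partition on one bad record.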


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000

© 2024 Teal Labs, Inc