Senior Data Engineer

Lytx, Inc.
San Diego, CA
$136,000 - $172,000

About The Position

At Lytx, we make roadways safer by transforming real‑time video and telematics into actionable safety intelligence. The Data Platform is the backbone of that mission: ingesting and processing massive event streams, shaping trusted datasets, and serving low‑latency insights that power products and analytics across the company. As a Senior Data Engineer, you’ll design, build, and operate high‑scale streaming and batch pipelines, optimize our Redshift data warehouse, and raise the bar on data quality, reliability, and cost efficiency.

Requirements

  • 6+ years of data engineering experience designing and operating production‑grade streaming + batch data systems at scale.
  • Deep hands‑on experience with Kafka and stream processing technologies.
  • Strong SQL and performance tuning in Redshift; practical experience with Postgres.
  • Production experience with Airflow (DAG design, SLAs, retries, idempotency) and Flink (job optimization).
  • Experience with open table formats, preferably Apache Iceberg, including partition evolution and schema evolution; comfort with Athena query performance.
  • Solid data modeling for telemetry/events and analytics; understanding of data contracts, schema registry, and CDC patterns.
  • Proficiency in object-oriented programming languages such as C# / Java / Python; strong software engineering fundamentals (testing, CI/CD, code review).
  • AWS experience (S3, Redshift, Athena, IAM; EKS is a plus).
  • Clear communicator who owns outcomes, balances speed with rigor, and thrives in a collaborative, high‑impact environment.

Nice To Haves

  • Experience with IoT/telematics data.
  • Knowledge of dbt / Great Expectations (or other data testing frameworks).
  • Terraform/IaC, Kubernetes, and cost‑aware architecture.

Responsibilities

  • Build and operate streaming pipelines with Kafka and streaming technologies (e.g., Flink) for real‑time telemetry and event processing.
  • Design robust ETL with Airflow into Redshift and S3, supporting analytics, product use cases, and downstream services.
  • Model data (event/telemetry and serving schemas); own data contracts, schema evolution, and table partitioning/compaction strategies.
  • Optimize performance and cost across Redshift, Iceberg/Athena, and streaming jobs.
  • Harden quality, reliability, and observability: SLAs/SLOs, lineage, testing, validation, alerting, and incident response for data pipelines and datasets.
  • Serve data to applications via APIs/GraphQL and scalable query services (Athena/Redshift), ensuring predictable latency and well‑documented contracts.
  • Collaborate cross‑functionally with application teams to translate business needs into durable, scalable data assets.
  • Mentor engineers and contribute to standards, patterns, and best practices; be a continuous learner who uplifts team capabilities.

Benefits

  • Medical, dental and vision insurance
  • Health Savings Account
  • Flexible Spending Accounts
  • Telehealth
  • 401(k) and 401(k) match
  • Life and AD&D insurance
  • Short-Term and Long-Term Disability
  • FTO or PTO
  • Employee Well-Being program
  • 11 paid holidays plus 1 inclusive holiday per year
  • Volunteer Time Off
  • Employee Referral program
  • Education Reimbursement Program
  • Employee Recognition and Appreciation program
  • Additional perk and voluntary benefit programs