About The Position

We’re hiring our first Senior Data Engineer to architect, build, and own our end-to-end data platform. Our position in the AI marketplace gives us uniquely rich usage and performance data; your job is to turn that data into reliable and scalable datasets and products that deliver clear value to customers while informing our own product decisions. You’ll stand up our data warehousing and pipelines from scratch, connect product and platform data, and make high-leverage data accessible and trustworthy to our customers and across the company.

Requirements

  • 4+ years as a Data Engineer (or similar), including owning production pipelines and a modern data warehouse (ClickHouse, Snowflake, Databricks, etc.)
  • Expert-level SQL and Python, with deep experience in ETL/ELT design, data modeling, and performance tuning.
  • Experience building end-to-end data infrastructure: storage, compute, networking, orchestration, CI/CD for data, monitoring/alerting, and cost management.
  • Excellent communicator who can self-manage, set expectations, and partner across functions while working asynchronously.
  • Customer-obsessed and product-minded: you start from the user problem and ship pragmatically to deliver value fast.
  • Biased toward action, comfortable with ambiguity, and able to prioritize for impact. You default to simple, reliable solutions and iterate quickly.

Nice To Haves

  • Experience as the first/early data hire building 0 to 1
  • Direct experience with ClickHouse, Postgres, GCP, or Terraform
  • Experience standing up event streaming platforms (e.g., Kafka, Pub/Sub)
  • Experience with database privacy/compliance standards (e.g., SOC 2, GDPR/CCPA)

Responsibilities

  • Stand up a central analytics store. Define schemas, partitioning, retention, and performance strategies for scale. Establish data contracts and documentation to keep data consistent and discoverable.
  • Build, operate, and monitor robust ETL/ELT pipelines. Implement CDC and batch/stream ingestion patterns; ensure idempotency, safe backfills, and predictable reprocessing.
  • Design and ship the data foundations for customer-facing metrics (latency, throughput, error rates, reliability/SLOs, cost), with strong handling of late/duplicate events. Build secure, multi-tenant datasets and APIs for customers; enforce isolation with row/column-level security, access controls, and privacy guardrails.
  • Take data features from concept to production: design, implement, backfill/migrate, document, and support.

Benefits

  • Attractive salary, equity, and benefits that reflect the impact you’ll have on our success.
  • Be at the forefront of AI and LLM infrastructure, building tools for developers and enterprises alike.
  • Work closely with a tight-knit group of top-notch engineers who value excellence and innovation.
  • As we scale, grow into leadership roles or specialize further in your area of interest.
© 2024 Teal Labs, Inc