Fox · Posted about 20 hours ago
$143,000–$177,000/yr
Full-time • Mid-Level
Hybrid • Los Angeles, CA

We are seeking a Senior Data Engineer to join our Product Research & Analytics team. In this role, you will be responsible for building and maintaining the data infrastructure that powers product decision-making across our DTC streaming platforms. You’ll play a key role in designing data pipelines, ensuring data quality and observability, and enabling the integration of product event data with Media Cloud Platforms (MCPs) and third-party analytics systems. You’ll partner closely with Product, Analytics, and Platform Engineering to scale our data ecosystem and support both internal and external use cases. This role requires strong technical expertise, a proactive mindset, and the ability to independently drive initiatives from design to deployment.

Responsibilities:

  • Design, build, and maintain scalable ETL/ELT pipelines using Python and SQL
  • Optimize data workflows for event-level modeling and large-scale analytics use cases
  • Lead integration of product data with MCPs and third-party measurement or marketing platforms
  • Collaborate with Analytics and Product Engineering to improve instrumentation, taxonomy, and event tracking consistency
  • Design and implement observability frameworks to detect schema drift, latency, volume anomalies, and data quality issues
  • Develop and enforce best practices around data modeling, validation, and governance
  • Partner with Data Platform and Infrastructure teams to evolve our cloud-based lakehouse architecture (Snowflake, Databricks)
  • Troubleshoot and resolve complex data pipeline issues independently
  • Document pipeline logic, integration specs, and data contracts for cross-team collaboration
  • Support the migration of legacy systems to modern data architectures
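For candidates unfamiliar with the observability responsibility above, a minimal sketch of the kind of check it describes (detecting schema drift and volume anomalies in event data) might look like the following. All names here (EXPECTED_SCHEMA, check_batch) are hypothetical illustrations, not part of Fox's actual stack:

```python
# Hypothetical sketch of a batch-level data-quality check: flags schema
# drift (missing or mistyped fields) and simple volume anomalies.
EXPECTED_SCHEMA = {"user_id": str, "event_name": str, "ts": float}

def check_batch(rows, min_rows=1):
    """Return a list of human-readable issues found in a batch of event dicts."""
    issues = []
    # Volume anomaly: batch is suspiciously small (e.g. upstream outage).
    if len(rows) < min_rows:
        issues.append(f"volume anomaly: {len(rows)} rows < {min_rows}")
    for i, row in enumerate(rows):
        # Schema drift: expected fields absent from the event.
        missing = EXPECTED_SCHEMA.keys() - row.keys()
        if missing:
            issues.append(f"row {i}: missing fields {sorted(missing)}")
        # Schema drift: field present but with the wrong type.
        for field, typ in EXPECTED_SCHEMA.items():
            if field in row and not isinstance(row[field], typ):
                issues.append(
                    f"row {i}: {field} is {type(row[field]).__name__}, "
                    f"expected {typ.__name__}"
                )
    return issues
```

In production this logic would typically live in an orchestration framework and feed an alerting tool such as the ones named below (Grafana, Datadog, Monte Carlo), rather than run as a standalone function.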
Qualifications:

  • Expertise in SQL with experience handling complex, high-volume analytical queries
  • Proficiency in Python for ETL development, orchestration, and data validation
  • Hands-on experience working with Snowflake and/or Databricks in production environments
  • Familiarity with event-based product analytics tools (Heap, Amplitude, Segment, etc.)
  • Deep understanding of event data models, including user identity stitching, funnel tracking, and page/user interactions
  • Experience designing and operating integrations between first-party product data and third-party systems (e.g., MCPs, clean rooms, analytics vendors)
  • Experience with observability and alerting tools like Grafana, Datadog, Monte Carlo, or equivalents
  • Strong collaboration and communication skills in cross-functional engineering settings
  • Ability to take ownership of ambiguous problems and drive scalable technical solutions
Preferred Qualifications:

  • Experience architecting or contributing to data integrations with media measurement, marketing, or audience platforms
  • Experience with Databricks migrations or transitioning from cloud warehouse to lakehouse environments
  • Exposure to governance and tagging frameworks, including PII/data privacy handling
  • Experience in streaming or media analytics environments (video playback, engagement funnels, metadata modeling)
  • Familiarity with data contract enforcement or CI/CD for data pipelines
  • Experience managing schema registries, metadata cataloging, or automated testing for pipelines
Benefits:

  • Medical, dental, and vision insurance
  • A 401(k) plan
  • Paid time off
  • Other benefits in accordance with applicable plan documents