Lead Data Engineer

Inception Point AI
Hybrid

About The Position

Inception Point AI is a media-tech company building the next generation of AI-powered personalities, shows, and content brands. We blend cutting-edge AI tools with storytelling, design, and production discipline to create scalable, high-quality content across YouTube, social platforms, and podcasts. Our work sits at the intersection of creativity, automation, and entertainment, and we're building systems that allow great ideas to move fast.

We are looking for a Lead Data Engineer who is not just an AI enthusiast, but a builder and visionary for our data ecosystem. Working on the frontier of AI and multi-agent orchestration, you will take ownership of our proprietary analytics system, transitioning it from a developer-led prototype into a production-grade engine. You will be the technical architect responsible for how we collect, store, and transform data to drive meaningful revenue, audience reach, and strategic licensing opportunities.

This is a "hybrid territory" role requiring a deep engineering core combined with a business analytics mindset. You will bridge the gap between complex data infrastructure and non-technical stakeholders, ensuring that our data is not just available, but accessible and actionable for everyone in the organization.

Requirements

  • Technical Mastery: Complete mastery of SQL and Airflow, along with high proficiency in Python.
  • Cloud Infrastructure: Deep familiarity with typical frameworks for cloud-based data storage and management.
  • High-Volume Experience: Proven track record of working with very large datasets and high transaction volumes.
  • Analytical Rigor: Ability to perform at least basic data analysis: identifying issues, transforming datasets, and examining information from multiple perspectives.
  • Communication: Strong "people skills" with a genuine interest in interviewing stakeholders to understand the business questions they need the data to answer.

Responsibilities

  • System Ownership & Evolution: Partner with the engineering team to lead the knowledge transfer of "Atlas" (built in Python), taking full architectural ownership and ensuring its continued health and progression.
  • Data Ingestion & Health: Use Apache Airflow to manage and optimize the daily ingestion of analytics from multiple disparate sources (social media, podcast platforms, etc.), ensuring a clean, reliable, and "healthy" data stream.
  • Strategic Data Discovery: Act as a "data detective" to identify what information we are missing and prioritize new data collection that aligns with our financial stability and growth levers.
  • Self-Service Enablement: Build and maintain intuitive dashboards (e.g., Power BI or custom builds) that allow non-technical peers to answer basic everyday questions without needing manual engineering support.
  • Data Productization: Transform raw data into structured, "customer-ready" packages to support our heavy push into data licensing.
  • Build vs. Buy Advisory: Utilize your knowledge of the media analytics landscape to guide the organization on whether to build custom internal tools or leverage existing third-party social media analytics solutions.
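To give a flavor of the ingestion and data-health work described above, the kind of normalization step a scheduled Airflow task might run can be sketched roughly as follows. This is a minimal, standalone illustration, not the actual "Atlas" system: all source names, field names, and sample payloads here are hypothetical.

```python
from datetime import date

# Hypothetical raw payloads: each platform reports plays under different
# field names and formats (sample data for illustration only).
RAW_YOUTUBE = [{"video_id": "abc", "viewCount": "1200", "day": "2024-05-01"}]
RAW_PODCAST = [{"episode": "ep-01", "listens": 450, "date": "2024-05-01"}]

def normalize_youtube(rows):
    """Map YouTube-style rows onto a shared schema."""
    return [
        {
            "source": "youtube",
            "content_id": r["video_id"],
            "plays": int(r["viewCount"]),
            "date": date.fromisoformat(r["day"]),
        }
        for r in rows
    ]

def normalize_podcast(rows):
    """Map podcast-host rows onto the same shared schema."""
    return [
        {
            "source": "podcast",
            "content_id": r["episode"],
            "plays": int(r["listens"]),
            "date": date.fromisoformat(r["date"]),
        }
        for r in rows
    ]

def daily_ingest():
    """Combine all sources into one stream, dropping unhealthy rows."""
    combined = normalize_youtube(RAW_YOUTUBE) + normalize_podcast(RAW_PODCAST)
    # Basic health check: keep only rows with non-negative play counts.
    return [row for row in combined if row["plays"] >= 0]
```

In a real deployment each `normalize_*` function would sit behind an Airflow task pulling from a platform API, with the daily schedule and retries handled by the DAG rather than in this code.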