About The Position

We're building the future of enterprise data systems: a canonical data product that serves as the single source of truth while enabling next-generation AI-powered experiences. This isn't just moving and managing data; it's architecting coupled, intelligent systems that power both traditional analytics and agentic capabilities at scale.

You will own core components of our multi-layered data architecture, spanning raw ingestion through AI-ready semantic layers, working with modern cloud technologies. The technical challenges are substantial: distributed systems design, near-real-time processing, complex transformations, data quality frameworks, and emerging AI architecture. We are looking for someone who combines exceptional data engineering fundamentals with deep conviction in AI capabilities - the ideal candidate is comfortable designing systems, delivering autonomously, and thinking holistically about the future of data as a product. You will collaborate across data science, analytics, and software engineering teams, with real influence on architectural decisions.

This position is part of the AWS Specialist and Partner Organization (ASP). Specialists own the end-to-end go-to-market strategy for their respective technology domains, providing the business and technical expertise to help our customers succeed. Partner teams own the strategy, recruiting, development, and growth of our key technology and consulting partners. Together they provide our customers with the expertise and scale needed to build innovative solutions for their most complex challenges.

Requirements

  • 5+ years of data engineering experience
  • Experience with data modeling, warehousing and building ETL pipelines
  • Experience with SQL
  • Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
  • Experience mentoring team members on best practices

Nice To Haves

  • Experience with big data technologies such as Hadoop, Hive, Spark, or EMR
  • Experience operating large data warehouses

Responsibilities

  • Champion and apply agentic development practices in daily work, leveraging agent-based assistants and tools to improve code quality and pioneer innovative approaches to data engineering challenges.
  • Own the evolution of our AI-first data platform: you will be responsible for system designs, implementing scalable frameworks, mentoring builders, and evolving data governance systems that securely enable intelligent decision-making.
  • Champion data as a product and collaborate with data scientists, analysts, and software and product teams to understand requirements and architect data solutions that enable both traditional analytics and agentic capabilities.
  • Design and evolve data quality frameworks and validation systems to maintain trust in our canonical datasets, including automated monitoring, anomaly detection, and remediation workflows.
  • Design, build, and maintain scalable data pipelines that ingest, transform, and deliver high-quality datasets across our multi-layered architecture, from raw data landing through production-ready data marts.
  • Develop and optimize data transformations using SQL and Python to support both analytical workloads and AI-ready semantic layers, ensuring data accuracy, consistency, and performance at scale.

Benefits

  • health insurance (medical, dental, vision, prescription, basic life & AD&D insurance with optional supplemental life plans, EAP, mental health support, medical advice line, flexible spending accounts, and adoption and surrogacy reimbursement coverage)
  • 401(k) matching
  • paid time off
  • parental leave
© 2024 Teal Labs, Inc