Senior Analytics Engineer

Victory Live · Atlanta, GA

About The Position

The Senior Analytics Engineer will sit at the intersection of data infrastructure and intelligent automation, helping Victory Live's business intelligence function evolve from reactive reporting to proactive, AI-augmented insight. You'll build and maintain both the data foundation that powers our decision-making and the pipelines and models that feed automated and agentic workflows across the business.

Requirements

  • 5+ years of experience in analytics engineering, data engineering, or a closely related role.
  • Strong SQL fundamentals; comfortable writing complex transformations, debugging query performance, and reasoning about data model design.
  • Proficiency in Python for data processing, automation, and API integration.
  • Hands-on experience with a cloud data warehouse (Snowflake strongly preferred).
  • Experience with dbt or a comparable transformation framework.
  • Familiarity with workflow orchestration tools (Dagster, Airflow) or ELT services (Fivetran, Azure Data Factory).
  • Demonstrated comfort using AI coding assistants as a routine part of development — not as a novelty, but as a productivity multiplier with appropriate critical judgment.
  • Solid understanding of data modeling concepts: normalization, dimensional modeling, schema design, and data lineage.
  • Strong communication skills and a bias toward documentation and knowledge sharing.
  • Demonstrated ability to operate as a senior individual contributor — influencing technical direction, contributing to planning processes, and supporting the development of teammates.

Nice To Haves

  • Exposure to agentic or LLM-based workflows, e.g., building tool-use contexts, structured output pipelines, retrieval-augmented generation (RAG) data layers, or similar.
  • Experience with business intelligence platforms (Sigma, Looker, Tableau, or Power BI).
  • Familiarity with vector databases or semantic search infrastructure (Pinecone, pgvector, etc.).
  • Experience in cloud environments (Azure preferred, AWS or GCP also relevant).
  • Understanding of data governance, quality frameworks, and observability tooling (dbt tests, Great Expectations, Monte Carlo, etc.).
  • Comfort in fast-moving, ambiguous environments where the right answer is often "build it, learn from it, iterate."

Responsibilities

  • Design, build, and maintain robust ELT pipelines using tools like dbt, Snowflake, Dagster, and Azure Data Factory, applying AI-assisted development practices to accelerate delivery and reduce toil.
  • Write efficient SQL and Python, leveraging AI coding assistants (Cursor, GitHub Copilot, Claude, etc.) to improve velocity without sacrificing code quality or reviewability.
  • Build and maintain semantic data models including fact/dimension tables, metrics layers, and reusable datasets that serve both human analysts and programmatic/agentic consumers downstream.
  • Develop pipelines and data structures that support AI-powered features, including LLM context retrieval, embedding generation, structured output ingestion, and agent tool integrations.
  • Support BI and reporting platforms (Sigma) by ensuring data is well-modeled, documented, and performant.
  • Monitor pipelines and proactively surface data quality, reliability, and freshness issues, exploring automated alerting and remediation patterns where appropriate.
  • Maintain clear documentation of models, transformations, and lineage; contribute to an internal data catalog that enables both human and AI discoverability.
  • Integrate third-party APIs and internal services into pipelines, with awareness of how those data sources may feed automated workflows or agent-facing tools.
  • Participate in code reviews, testing, and Git-based workflows; hold a high bar for data integrity regardless of how code was generated.
  • Contribute to team goal-setting and technical roadmap planning, bringing a senior engineering perspective to prioritization and long-term data infrastructure decisions.
  • Mentor and develop junior and mid-level team members in best practices, SQL, data modeling, and the effective use of AI development tools.