Senior Data Engineer (Snowflake & dbt)

Teladoc Health | Uniondale, NY
$150,000 - $170,000 | Remote

About The Position

Join the team leading the next evolution of virtual care. At Teladoc Health, you are empowered to bring your true self to work while helping millions of people live their healthiest lives. Here you will be part of a high-performance culture where colleagues embrace challenges, drive transformative solutions, and create opportunities for growth. Together, we're transforming how better health happens.

Summary of Position

The Senior Data Engineer will serve as a foundational contributor to our data platform modernization initiative, accelerating the organization's transition to a cloud-native, AI-augmented engineering model. This role is responsible for designing, building, and operationalizing production-grade data pipelines on Snowflake using dbt, with a strong emphasis on engineering rigor, SDLC discipline, and AI-assisted development practices anchored in GitHub.

The ideal candidate brings deep expertise in modern ELT architecture and treats software engineering best practices (version control, automated testing, CI/CD, peer review, and documentation) as non-negotiable standards. Equally important is the ability to leverage AI-powered developer tooling (GitHub Copilot, Copilot Chat, and emerging agentic workflows) to accelerate delivery, reduce technical debt, and elevate team capability.

As a senior practitioner, you will partner with Marketing, Data Science, and Platform Engineering teams to deliver data solutions that enable advanced segmentation, personalization, and predictive analytics at scale, while actively contributing to the codification of SDLC practices, coding standards, and data governance across the department.

Requirements

  • 8+ years of experience in data engineering, analytics engineering, or SQL-intensive data development roles.
  • 3+ years of hands-on, production experience with dbt (Core or Cloud) and Snowflake as primary platforms.
  • Deep GitHub expertise (branching, pull request-based development, Actions for CI/CD).
  • Strong SQL expertise including query optimization, window functions, recursive CTEs, and performance tuning on large-scale cloud data warehouses.
  • Experience with ELT orchestration (Airflow/Astronomer, dbt Cloud, or equivalent).
  • Familiarity with CDC patterns, event streaming architectures (Kafka, Kinesis, or equivalent), and real-time data ingestion use cases.
  • Proven track record migrating legacy ETL/SQL logic (Redshift, Oracle, Teradata) into modern ELT architectures with full test coverage and documentation.
  • Excellent communication skills with the ability to convey technical concepts clearly to non-technical stakeholders.

Nice To Haves

  • Experience using GitHub Copilot, Copilot Chat, and/or other agentic coding tools (e.g., Claude Code, Cursor, Copilot Workspace) to accelerate development, including creating internal prompt libraries and AI workflow patterns, is highly preferred.
  • Experience with Salesforce Data360 (Data Cloud) or Salesforce Marketing Cloud for audience segmentation and activation is a plus.
  • Familiarity with data quality and observability platforms such as Metaplane, Monte Carlo, or Great Expectations.
  • Knowledge of HIPAA compliance requirements and healthcare data standards (HL7, FHIR) in data engineering contexts.
  • Snowflake SnowPro Core or Advanced certification, or dbt Certified Developer credential.

Responsibilities

Data Engineering & Architecture
  • Build and maintain scalable ELT pipelines using dbt Core/Cloud on Snowflake.
  • Develop performant data models with optimized materializations and incremental processing (see the dbt model sketch after this list).
  • Lead migration from legacy Redshift pipelines to Snowflake/dbt, ensuring functional parity, improved performance, and full test coverage.
  • Implement data quality tests, observability frameworks, and CDC/streaming ingestion patterns.

SDLC & Engineering Standards
  • Enforce GitHub-based development practices (branching, PR review, CI/CD, versioning).
  • Maintain repo structure, documentation, and automated workflows.
  • Build CI/CD pipelines using GitHub Actions for dbt build/test/deploy (see the workflow sketch after this list).
  • Support IaC practices for Snowflake resource provisioning.

AI-Assisted Development
  • Use GitHub Copilot/Copilot Chat to accelerate coding, testing, documentation, and review.
  • Analyze and modernize legacy SQL with AI tooling.
  • Experiment with emerging agentic development tools (Copilot Workspace, Claude Code).
  • Create and share prompt libraries and AI workflow patterns that improve team velocity and consistency in pipeline development, testing, and documentation tasks.

Data Governance & Operational Excellence
  • Maintain thorough documentation of all pipelines, data models, and business logic in dbt docs, Confluence, or equivalent tools, ensuring knowledge is accessible and transferable.
  • Monitor pipeline performance and resolve SLA issues using observability tools (Metaplane, dbt Cloud, Snowflake Query History).
  • Participate in architecture processes and cross-functional design reviews for data platform decisions.
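
For context, here is a minimal sketch of the kind of dbt incremental model this role would build on Snowflake, as referenced in the Data Engineering & Architecture bullets above. The model and column names (stg_member_events, event_id, event_ts) are illustrative placeholders, not Teladoc's actual schema:

    -- models/fct_member_events.sql (hypothetical model name)
    -- Incremental materialization: each scheduled run processes only
    -- rows newer than the latest event_ts already loaded.
    {{ config(materialized='incremental', unique_key='event_id') }}

    select
        event_id,
        member_id,
        event_type,
        event_ts
    from {{ ref('stg_member_events') }}

    {% if is_incremental() %}
      -- applied only on incremental runs; skipped on full refreshes
      where event_ts > (select max(event_ts) from {{ this }})
    {% endif %}

Pairing a model like this with unique and not_null tests declared in its schema .yml file is what "full test coverage" typically means in a dbt codebase.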
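Similarly, a sketch of a GitHub Actions workflow that builds and tests dbt on every pull request, as referenced in the SDLC & Engineering Standards bullets above. The workflow name, the "ci" target, and the secret names are assumptions; the profiles.yml is assumed to read Snowflake credentials from environment variables:

    # .github/workflows/dbt-ci.yml (illustrative only)
    name: dbt-ci
    on:
      pull_request:
        branches: [main]
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: "3.11"
          - run: pip install dbt-snowflake
          # install any packages declared in packages.yml
          - run: dbt deps
          # dbt build runs models and their tests in one pass
          - run: dbt build --target ci
            env:
              SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
              SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
              SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}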

Benefits

  • In addition to a base salary, this position is eligible for a performance bonus and benefits (subject to eligibility requirements) listed here: Teladoc Health Benefits 2026.
  • We follow a Flexible Vacation Policy, intended for rest, relaxation, and personal time.
  • All time off must be approved by your manager prior to use.
  • You will also receive 80 hours of Paid Sick, Safe, and Caregiver Leave annually.