Senior Data Engineer

Milton, GA

About The Position

The data engineer role is changing. The traditional pattern—build a pipeline, hand it off, wait for someone else to figure out if the data is right—doesn’t work anymore. The engineers who create the most value now are the ones who go deep into the business domain, understand the problem firsthand, and use AI tools to move from question to working solution in hours instead of sprints.

MDVIP’s Analytics team is looking for a Senior Data Engineer who operates as a hybrid technical product owner: someone who builds and maintains the data platform on Azure Databricks, but who also sits with business stakeholders, interrogates the problem, and owns the outcome end-to-end. You won’t wait for requirements to be handed to you. You’ll go find them, validate them, and ship the solution—using Claude Code and agentic development patterns to collapse the distance between understanding a business problem and solving it in production.

This is what it means to shift the engineer left into the business domain. You’re not a pipeline builder waiting for a ticket. You’re the person who understands how physician network growth, member engagement, and operational performance actually work—and who builds the data infrastructure that makes the entire Analytics team faster, sharper, and more impactful.

Requirements

  • BS in Computer Science, Data Science, or a related field; 6+ years in data engineering or a hybrid data engineering/analytics role.
  • Deep hands-on experience with Azure Databricks—notebooks, Delta Lake, Unity Catalog, and production-scale pipelines.
  • Strong Python and SQL; experience with PySpark and distributed data processing.
  • Experience building and operating data pipelines that serve analytics, ML models, and operational systems—not just batch ETL jobs.
  • Experience working directly with business stakeholders to define requirements, shape data products, and deliver measurable outcomes.
  • Active, daily use of AI coding tools (Claude Code, Copilot, or similar) as a force multiplier.
  • Strong communication skills with a track record of presenting technical work to non-technical audiences.

Responsibilities

  • Design, build, and operate MDVIP’s data platform on Azure Databricks—ingestion, transformation, storage, and serving layers that power analytics, AI models, and operational reporting.
  • Build and maintain data pipelines across MDVIP’s ecosystem: Salesforce, SQL Server, Snowflake, third-party sources, and the new cloud-native payments platform.
  • Engineer for quality and trust—validation checks, anomaly detection, lineage tracking, and documentation that ensure every downstream consumer can rely on the data.
  • Write clean, version-controlled, production-grade code. Think like a software engineer building a product, not a script runner maintaining jobs.
  • Partner directly with business stakeholders across physician growth, member services, finance, and operations to understand how data drives decisions—then build for those decisions, not for abstract requirements.
  • Act as a technical product owner for your domain areas: own the backlog, prioritize based on business impact, and ship iteratively without waiting for a PM to sequence your work.
  • Translate ambiguous business questions into data models, feature tables, and curated datasets that analysts and data scientists can build on immediately.
  • Close the loop—follow your data through to the dashboard, the model, or the operational workflow and validate that it’s actually driving the outcome.
  • Use Claude Code and agentic development as your primary workflow—AI-driven pipeline generation, automated testing, rapid prototyping—to ship at a pace that would be impossible with traditional approaches.
  • Build data infrastructure that is AI-ready: well-documented, semantically clear, and structured so that AI tools and agents can reason over it effectively.
  • Scout, evaluate, and adopt emerging AI tools and platforms that make the data team faster—separating real value from hype with hands-on testing.
  • Share what you learn. Document patterns, run demos, and help the broader team adopt AI-first workflows with confidence.