Senior Data Engineer / Architect (ADF & Databricks)

United Vein & Vascular Centers
Tampa, FL
Remote

About The Position

The Senior Data Engineer will report to the Senior Director of Data & AI and will help ensure that UVVC’s implementation and use of technology meets the business needs and supports the company mission. As a key member of the Data team, you’ll collaborate with leaders and their teams across the entire organization. In this role, you will identify opportunities to leverage technology and shape strategic and technical direction across UVVC business units. This position is open to remote candidates.

Requirements

  • 6+ years of hands-on data engineering experience.
  • Strong expertise in Azure Data Factory (ADF): pipelines, mapping data flows, and integration runtime (IR) management; Azure Databricks: notebooks, Spark, Delta Lake, and workflow orchestration; and SQL: complex logic, performance optimization, analytics queries, and stored procedures.
  • Proficiency in building scalable cloud ETL/ELT solutions.
  • Deep expertise in Lakehouse architecture (Medallion: Bronze/Silver/Gold) and Delta Lake optimization techniques.
  • Proven experience operating at a Databricks Architect level, designing and implementing enterprise-scale data platforms.
  • Strong understanding of Databricks Unity Catalog, data governance, and security models.
  • Experience defining data platform standards, frameworks, and best practices across teams.
  • Strong knowledge of data modeling, governance, and distributed processing.

Nice To Haves

  • Experience with AI/ML workflows, feature engineering, or model enablement.
  • Healthcare data experience (EHR/EMR, HL7, FHIR, claims, RCM).
  • Experience leading or mentoring teams on Databricks architecture and best practices.
  • Experience building multi-workspace or multi-environment Databricks strategies (Dev/Test/Prod).
  • CI/CD experience (Azure DevOps, GitHub Actions, Databricks Repos).
  • Python experience for ETL logic, automation, or ML support.
  • Familiarity with real-time processing (Structured Streaming) within Databricks.
  • Experience with Azure Synapse or equivalent warehousing technologies.

Responsibilities

  • Design, build, and optimize end-to-end ETL/ELT pipelines using Azure Data Factory and Databricks.
  • Develop robust ingestion frameworks for batch and streaming data from APIs, databases, SaaS platforms, and internal systems.
  • Create scalable and architecturally sound data transformation frameworks using Delta Lake (medallion architecture), Spark, and SQL, aligned with enterprise Lakehouse standards.
  • Implement CI/CD, parameterization, triggers, and pipeline automation best practices.
  • Architect, manage, and optimize enterprise data environments across ADLS, Azure SQL, and Databricks, including cluster design, cost governance, and workload isolation strategies.
  • Conduct performance tuning, cluster scaling, monitoring, and cloud cost optimization.
  • Implement DataOps practices including testing, version control, monitoring, and documentation.
  • Build data validation, auditing, and error-handling frameworks to ensure accuracy and consistency.
  • Maintain documentation, naming standards, metadata structures, and governance best practices.
  • Troubleshoot complex data issues and deliver sustainable technical solutions.
  • Design and implement enterprise-grade Databricks Lakehouse architecture (Bronze, Silver, Gold layers).
  • Define and enforce data engineering standards, naming conventions, and architectural patterns across all pipelines.
  • Lead the architecture of Delta Lake design patterns, including partitioning, optimization, and data lifecycle management.
  • Establish scalable cluster strategies, job orchestration frameworks, and workspace organization.
  • Drive performance optimization strategies across large-scale distributed data workloads.
  • Own security and governance frameworks within Databricks (Unity Catalog, access controls, data lineage).
  • Evaluate and implement new Databricks capabilities and ensure alignment with enterprise data strategy.
  • Work closely with clinical, finance, RCM, operations, and IT teams to understand business needs.
  • Provide technical guidance on data engineering patterns and platform capabilities.
  • Clearly communicate progress, risks, and technical decisions to stakeholders and leadership.
  • Demonstrate and promote a work culture committed to UVVC’s Core Values: Understanding, Nurturing, Ingenuity, Trust, Excellence, and Diversity.
  • Demonstrate behaviors that are consistent with UVVC’s Standards of Conduct as outlined in our Employee Handbook.
  • Uphold confidentiality and compliance standards in accordance with UVVC policies, the Health Insurance Portability and Accountability Act (HIPAA), and other applicable laws and regulations. Protecting PHI is a top priority of our organization.