Data Platform Engineer - 6-Month Assignment

MaxHealth
Lake Sarasota, FL
Hybrid

About The Position

MaxHealth is seeking a highly skilled Data Platform Engineer (DBA) to join our data engineering team and help design, modernize, and support a cloud-based data platform, with a focus on transitioning legacy SSIS-driven ELT processes to Azure Data Factory (ADF) and advancing toward a Microsoft Fabric-enabled architecture. This role will help establish a scalable data lake foundation, enable modern analytics capabilities, and ensure data reliability through strong monitoring, alerting, and database administration practices. The position is critical to improving data quality, operational stability, and enterprise data accessibility.

Requirements

  • 4+ years of experience in data engineering, data platform, or DBA roles
  • Strong experience with:
      ◦ Azure Data Factory (ADF)
      ◦ SQL Server and/or Azure SQL
      ◦ SSIS (with migration/modernization experience)
  • Hands-on experience with data lake architectures and cloud storage patterns
  • Experience implementing monitoring and alerting frameworks
  • Strong SQL skills and experience with scripting (Python, PowerShell, or similar)
  • Understanding of ELT/ETL patterns and modern data architectures

Nice To Haves

  • Experience with Microsoft Fabric (Lakehouse, Warehouse, Data Factory)
  • Familiarity with Delta Lake and medallion architecture design
  • Experience with Azure Data Lake Storage Gen2 or OneLake
  • Experience with observability tools (Azure Monitor, Log Analytics)
  • CI/CD and infrastructure-as-code experience (ARM, Terraform)
  • Experience in healthcare or regulated environments

Responsibilities

  • Lead migration of legacy SSIS pipelines to Azure Data Factory (ADF)
  • Support evolution toward Microsoft Fabric (Data Factory, Lakehouse, and Warehouse experiences)
  • Redesign ELT processes to align with lakehouse and medallion architecture patterns (bronze/silver/gold)
  • Build scalable, reusable, and parameter-driven data pipelines
  • Integrate diverse data sources into a centralized cloud platform
  • Design and implement end-to-end monitoring across ADF, Fabric, and database environments
  • Establish alerting frameworks for pipeline failures, data quality issues, and performance degradation
  • Partner with Data Operations to support intake, triage, and incident management (L2 support model)
  • Define and track SLAs/SLIs for pipeline performance and data availability
  • Perform root cause analysis and implement preventative controls
  • Perform core SQL Server / Azure SQL / Fabric Warehouse administration:
      ◦ Performance tuning and query optimization
      ◦ Indexing strategies and maintenance
      ◦ Backup, recovery, and disaster recovery planning
      ◦ Capacity and workload management
  • Ensure database security, access controls, and compliance adherence
  • Embed automated data quality checks within pipelines and lake layers
  • Monitor completeness, accuracy, and timeliness of critical datasets
  • Partner with business stakeholders to resolve data issues and improve trust in data
  • Contribute to scalable, secure, and governed data architecture
  • Implement CI/CD pipelines for ADF and Fabric deployments (Azure DevOps)
  • Enforce standards for pipeline development, naming conventions, and code management
  • Document data lineage, architecture, and operational runbooks

Benefits

  • Possibility of extension into a permanent position


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No education listed
  • Number of Employees: 1-10 employees
