Sr. Data Engineer (Hybrid or Remote)

Mashura LLC, Scottsdale, AZ
Hybrid

About The Position

We are seeking a Senior Data Engineer to help shape and scale our modern data platform built on Microsoft Fabric. We have established the foundation of a lakehouse architecture, ingesting device and SaaS application data from thousands of medical dispensing cabinets deployed globally. The next phase is expanding and hardening this platform to support:

  • Product intelligence and AI-driven insights
  • Customer-facing analytics and reporting
  • Internal operational efficiency and automation
  • Scalable, trustworthy enterprise data models

This role is both strategic and hands-on. You will design, build, and optimize scalable data pipelines and semantic models while helping define the future of our analytics and AI capabilities.

Requirements

  • 5+ years of professional experience in data engineering or analytics engineering
  • Hands-on experience with Microsoft Fabric (lakehouse, pipelines, notebooks, semantic models)
  • Strong experience with PySpark and distributed data processing
  • Deep understanding of modern data architectures (lakehouse, medallion patterns)
  • Experience designing scalable ETL/ELT pipelines
  • Strong SQL skills and data modeling expertise
  • Experience working with event-driven or streaming data systems
  • Strong collaboration and communication skills
  • Comfortable working cross-functionally with engineering, product, and business teams

Nice To Haves

  • Experience with IoT or device-generated data
  • Experience integrating operational systems (SaaS platforms, APIs) into analytics environments
  • Familiarity with Azure services (Event Hubs, Cosmos DB, Azure SQL)
  • Exposure to AI/ML workflows and feature engineering
  • Experience implementing data governance and lineage tooling

Responsibilities

  • Architect and build scalable data pipelines using Microsoft Fabric and PySpark
  • Design and evolve our lakehouse architecture
  • Develop robust semantic models and analytics-ready datasets
  • Implement data quality validation, monitoring, and observability practices
  • Optimize performance and cost efficiency across Fabric workloads
  • Collaborate with backend engineers to define event schemas and data contracts
  • Enable product intelligence initiatives, including AI/ML experimentation
  • Contribute to data governance and security best practices