Senior Data Engineer

Trane Technologies · Bloomington, MN
Hybrid

About The Position

At Trane Technologies™, and through our businesses including Trane® and Thermo King®, we create innovative climate solutions for buildings, homes, and transportation that challenge what’s possible for a sustainable world. We're a team that dares to look at the world's challenges and see impactful possibilities. We believe in a better future when we uplift others and enable our people to thrive at work and at home.

We are looking for a highly skilled Senior Data Engineer with deep expertise in Azure data technologies to own and evolve the data foundation of our enterprise SaaS and next-generation telematics platform. You will be responsible for all aspects of data ingestion, transformation, storage, analytics, quality, and performance across millions of IoT events per day. This is a hands-on, high-impact role that partners closely with architecture, backend engineering, product, project management, and leadership teams. To be successful in this role, you must be able to hit the ground running, operate independently, and bring strong experience building scalable, reliable, cost-optimized cloud data solutions.

Requirements

  • Bachelor’s Degree in Computer Science, Information Systems or Mathematics.
  • 5+ years of professional data engineering experience.
  • Strong Azure expertise, including hands-on experience with:
      • Cosmos DB (required)
      • Azure SQL
      • Azure Data Explorer (ADX)
      • ADLS Gen2
      • Microsoft Fabric (or equivalent modern cloud DWH experience)
  • Strong proficiency in Python, SQL, and Spark.
  • Experience with ETL/ELT processes at scale (structured + semi-structured + JSON).
  • Experience with real-time ingestion (Event Hub, Kafka, or similar).
  • Strong understanding of distributed systems, partitioning, and data modeling at scale.
  • Hands-on experience supporting production SaaS platforms.
  • Ability to work cross-functionally (architecture, development, DevOps, PMs, leadership).
  • Self-starter who can operate independently and “hit the ground running”.
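As context for the ETL/ELT requirement above, working with semi-structured IoT payloads typically means flattening nested JSON into tabular records before loading. A minimal Python sketch of that step (the payload shape and field names here are hypothetical, not this platform's actual schema):

```python
import json

def flatten_event(raw: str) -> dict:
    """Flatten a nested IoT telemetry payload (hypothetical shape)
    into a single flat record suitable for tabular storage."""
    event = json.loads(raw)
    flat = {
        "device_id": event["deviceId"],
        "event_time": event["timestamp"],
    }
    # Promote each nested sensor reading to its own top-level key,
    # which maps cleanly onto a column in SQL, ADX, or a Lakehouse table.
    for name, value in event.get("readings", {}).items():
        flat[f"reading_{name}"] = value
    return flat

raw = json.dumps({
    "deviceId": "tk-001",
    "timestamp": "2024-01-01T00:00:00Z",
    "readings": {"temp_c": -18.5, "door_open": False},
})
record = flatten_event(raw)
```

In practice the same flattening logic would run inside a Spark job, an ADF mapping data flow, or a Fabric dataflow rather than plain Python, but the transformation is the same.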

Nice To Haves

  • Experience in IoT / telematics, fleet management, or large-scale device ingestion.
  • Familiarity with H3 geospatial indexing or geospatial data pipelines.
  • Knowledge of Dapr, .NET microservices, or Java microservices (to support shared data contracts).
  • Familiarity with Orleans, event sourcing, or CQRS patterns.
  • Experience with ML/AI data prep workflows for predictive maintenance, anomaly detection, etc.
  • Azure certification(s) such as:
      • DP-203: Azure Data Engineer Associate.
      • DP-500: Azure Enterprise Data Analyst.

Responsibilities

  • Data Platform Architecture & Development:
      • Design, build, and maintain Azure-native data pipelines using:
          • Azure Data Factory / Synapse Pipelines
          • Azure Databricks (optional but beneficial)
          • Azure Functions
          • Event Hub / Service Bus
          • Logic Apps
      • Architect and optimize Cosmos DB (partitioning, indexing, schema strategy, TTL, throughput tuning).
      • Build and manage Azure Data Explorer (ADX) clusters and ingestion mappings.
      • Implement Microsoft Fabric workloads (Lakehouse, Real-Time Analytics, Data Warehouse).
      • Build and evolve data lakes (ADLS Gen2), including medallion architectures (Bronze/Silver/Gold).
      • Design and manage Azure SQL schemas, stored procedures, ETL jobs, and performance optimization.
  • ETL / ELT & Data Transformation:
      • Create high-throughput data ingestion pipelines from IoT devices, gateways, and microservices.
      • Build reusable transformation pipelines using Python, Spark, SQL, Fabric dataflows, or ADF mapping data flows.
      • Implement CI/CD practices for data pipelines using Azure DevOps or GitHub.
      • Build data quality, validation, and anomaly detection processes.
  • Data Modeling & Storage:
      • Design highly scalable data models for time-series telemetry, alarms, events, geospatial lookups, and fleet/asset metadata.
      • Implement optimal partitioning strategies across Cosmos DB, ADX, and the Fabric Lakehouse.
      • Manage the data lifecycle from hot path → warm path → cold storage.
      • Work with analytics and backend engineering to deliver performant APIs and dashboards.
  • Cross-Functional Collaboration:
      • Collaborate with architects on the design of the new telematics platform.
      • Work closely with software engineers to support microservice data contracts and ingestion patterns.
      • Support business stakeholders with analytics needs.
      • Partner with project managers to deliver data milestones on time.
  • Operations & Reliability:
      • Own the reliability of data pipelines for the current SaaS platform.
      • Monitor, tune, and optimize costs across Azure data services.
      • Ensure proper governance, security, compliance, and access control.
      • Create runbooks, documentation, and knowledge-sharing for the engineering org.
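To give a flavor of the partitioning work described above: a common Cosmos DB strategy for high-volume telemetry is a synthetic partition key that combines the device ID with a time bucket, spreading writes across logical partitions while keeping a device's recent data co-located. A minimal sketch, assuming a daily bucket and an underscore-delimited key format (illustrative choices, not this platform's actual scheme):

```python
from datetime import datetime, timezone

def synthetic_partition_key(device_id: str, event_time: datetime) -> str:
    """Build a synthetic Cosmos DB partition key: device ID plus a
    daily time bucket, so no single device's full history accumulates
    in one logical partition."""
    day_bucket = event_time.strftime("%Y%m%d")
    return f"{device_id}_{day_bucket}"

pk = synthetic_partition_key(
    "tk-001", datetime(2024, 1, 15, 8, 30, tzinfo=timezone.utc)
)
# pk == "tk-001_20240115"
```

Paired with a per-item TTL, a scheme like this also supports the hot → warm → cold lifecycle: recent buckets stay queryable in the hot store while aged data is archived to ADLS.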

Benefits

  • Benefits kick in on DAY ONE for you and your family, including health insurance and holistic wellness programs that include generous incentives – WE DARE TO CARE!
  • Family building benefits include fertility coverage and adoption/surrogacy assistance.
  • 401(k) match of up to 6%, plus an additional 2% core contribution, for a total company contribution of up to 8%.
  • Paid time off, including in support of volunteer and parental leave needs.
  • Educational and training opportunities through company programs along with tuition assistance and student debt support.