IT Data Developer

Tempur Sealy International
Trinity, NC
$81,000 - $116,000

About The Position

Tempur Sealy. Iconic brands. Storied history. Industry-leading innovation. Tempur Sealy is committed to improving the sleep of more people, every night, all around the world. As a leading designer, manufacturer, distributor, and retailer of bedding products worldwide, we know how crucial a good night's sleep is to overall health and wellness. Utilizing over a century of knowledge and industry-leading innovation, we deliver award-winning products that provide breakthrough sleep solutions to consumers in over 100 countries. Our highly recognized brands include Tempur-Pedic®, Sealy®, and Stearns & Foster®. We hire people who have a passion for helping others find their best night's sleep. No matter what stage of your career, you can build your future at Tempur Sealy!

We are seeking a Data Developer to design, build, and support modern data pipelines and data warehouse/lakehouse solutions. This role focuses on SQL-centric development across Microsoft Azure, Microsoft Fabric, and Snowflake, enabling trusted analytics and reporting for the business. The Data Developer will collaborate with data architects, BI developers, and business stakeholders to deliver high-quality, well-governed datasets that scale.

Requirements

  • Bachelor’s degree in Computer Engineering or Computer Science, or equivalent work experience
  • 3–6 years of experience in data development or data engineering
  • Strong SQL skills, with experience creating and maintaining database objects (tables, views, stored procedures, indexes, and functions) and optimizing queries and data transformations
  • Hands‑on experience with Azure data services (ADF, Azure SQL, ADLS, Synapse and/or Fabric)
  • Working experience with Microsoft Fabric (Lakehouse, Warehouse, OneLake, or notebooks)
  • Hands‑on experience with Snowflake as a data warehouse
  • Solid understanding of data warehousing concepts, ETL/ELT, and dimensional modeling
  • Experience using Git and following basic CI/CD or structured deployment practices

Nice To Haves

  • Experience integrating data from Microsoft ERP platforms, especially Dynamics 365 Finance & Operations (D365 F&O) or related exports/data lake patterns
  • Python or PySpark for data transformations or automation
  • dbt or comparable transformation frameworks
  • Experience with data governance tools (e.g., Microsoft Purview)
  • Snowflake certifications or Microsoft Azure/Fabric certifications (DP‑203, DP‑600)

Responsibilities

  • Develop and maintain ETL/ELT pipelines using Azure Data Factory and/or Fabric Data Factory to ingest data from databases, files, APIs, and SaaS sources
  • Build and optimize curated data layers in Microsoft Fabric (OneLake, Lakehouse, Warehouse) and Snowflake following modern patterns (e.g., Bronze/Silver/Gold)
  • Write, tune, and troubleshoot advanced SQL for transformations, validations, and analytics workloads
  • Design and maintain dimensional data models (facts/dimensions) to support BI and self‑service analytics (Power BI or similar)
  • Monitor and improve data quality, reliability, and performance, including incremental loads and cost optimization
  • Apply security and governance best practices (RBAC, controlled data access, basic lineage/metadata documentation)
  • Collaborate in an Agile environment, using DevOps, Git and work‑tracking tools to deliver well‑documented, production‑ready solutions