Senior Data Engineer

Carrier, Atlanta, GA

About The Position

Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating innovations that bring comfort, safety, and sustainability to life. Through cutting-edge advancements in climate solutions such as temperature control, air quality, and transportation, we improve lives, empower critical industries, and ensure the safe transport of food, lifesaving medicines, and more. Since inventing modern air conditioning in 1902, we have led with purpose: enhancing the lives we live and the world we share. We continue to lead because of our world-class, inclusive workforce that puts the customer at the center of everything we do.

The Senior Data Engineer enables the delivery of modern, scalable data solutions across commercial, operational, and enterprise domains within CSA engagements. The role focuses on designing and building robust data pipelines, integrating diverse data sources, and supporting open lakehouse and medallion-layered architectures. This engineer works closely with solution architects, data product leads, and customer teams to translate requirements into production-ready data assets that power analytics, reporting, and data products across the organization.

Requirements

  • Bachelor’s degree.
  • 5+ years of experience in data engineering.
  • 5+ years of experience with open lakehouse architectures on Azure, GCP, AWS, and/or Snowflake.
  • 5+ years of experience creating automated data pipelines and performing data modeling.
  • 5+ years of experience and proficiency in SQL, Python, and modern ETL/ELT frameworks.
  • Strong experience designing and building data pipelines, transformations, and data models.
  • Experience implementing open lakehouse architectures and medallion-layered data products.
  • Strong understanding of data integration patterns, orchestration, and distributed data processing.
  • Experience working with ERP and enterprise systems, with an emphasis on data extraction, modeling, and harmonization.
  • Experience working in customer-facing delivery environments, supporting CSA or similar solution-oriented teams.
  • Ability to work across cross-functional engineering, product, and business teams.
  • Strong communication skills for explaining technical concepts and influencing architectural decisions.
  • Experience working within Agile delivery models.

Nice To Haves

  • Bachelor of Science in Computer Science, Engineering, or a related technical field.
  • Experience with cloud platforms and modern data ecosystems, including data replication tools for enterprise systems.
  • Background supporting commercial, operational, or enterprise analytics use cases.
  • Familiarity with CI/CD, infrastructure-as-code, and modern DevOps practices.
  • Experience mentoring junior and early career engineers.
  • Knowledge of the manufacturing, logistics, and supply chain domains.
  • Experience working with ERP and CRM platforms such as Salesforce.

Responsibilities

  • Design, build, and optimize data ingestion, transformation, and delivery pipelines that support batch, micro-batch, and streaming workloads.
  • Develop ETL/ELT workflows that integrate data from enterprise systems, operational platforms, external sources, and commercial applications.
  • Implement scalable data processing patterns that support open lakehouse architectures and medallion-layered data products.
  • Ensure pipelines are reliable, maintainable, cost-efficient, and aligned with enterprise engineering standards.
  • Design data models that support analytics, reporting, and data product consumption across raw, refined, and curated layers.
  • Apply medallion architecture principles to ensure data quality, reusability, and consistency across domains.
  • Contribute to architectural decisions that balance performance, scalability, governance, and operational simplicity.
  • Support integration of ERP and enterprise systems, ensuring data consistency and alignment with business processes.
  • Implement data quality checks, validation rules, and monitoring frameworks to ensure accuracy and reliability.
  • Apply foundational data governance practices including lineage, metadata management, cataloging, and access controls.
  • Collaborate with governance and data product teams to ensure compliance with enterprise standards and regulatory requirements.
  • Partner with CSA teams, solution architects, and data product leads to translate business requirements into technical specifications.
  • Work closely with customer engineering and analytics teams to support onboarding, adoption, and operationalization of delivered pipelines.
  • Participate in Agile delivery processes, providing technical input on sizing, sequencing, and implementation planning.
  • Communicate technical concepts clearly to both technical and non-technical stakeholders.

Benefits

  • Health Care Benefits: Medical, Dental, Vision
  • Wellness incentives
  • Retirement Benefits
  • Paid vacation, up to 15 days
  • Paid sick leave, up to 5 days
  • Paid personal leave, up to 5 days
  • Paid holidays, up to 13 days
  • Birth and adoption leave
  • Parental leave
  • Family and medical leave
  • Bereavement leave
  • Jury duty leave
  • Military leave
  • Purchased vacation
  • Short-term and long-term disability
  • Life Insurance and Accidental Death and Dismemberment
  • Health Savings Account
  • Health Care Spending Account
  • Dependent Care Spending Account
  • Tuition Assistance


What This Job Offers

  • Job Type: Full-time
  • Career Level: Senior
  • Number of Employees: 5,001-10,000
