Senior Data Platform Engineer

ResMed
San Diego, CA

About The Position

Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The strategy of GTS focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

At ResMed, we don’t just build technology; we create solutions that transform lives. As a global leader in connected devices and digital health, we empower millions of people worldwide to sleep, breathe, and live better lives. We are looking for a Senior Data Platform Engineer to help build and scale the next generation of ResMed’s data ecosystem.

This is a senior individual contributor role for an engineer who combines strong software engineering fundamentals with deep data engineering and analytics experience. You will design, build, and operate reliable, scalable data systems that power analytics, data products, and advanced AI/ML use cases across the organization. If you enjoy solving complex data problems end to end, writing high-quality production code, and working closely with product, analytics, and data science partners, this role offers the opportunity to make a meaningful global impact at scale.

Location: San Diego

What You’ll Do

As a Senior Data Platform Engineer, you will be responsible for delivering robust, production-grade data solutions and contributing to the technical excellence of ResMed’s data platform, as detailed in the Responsibilities section below.

Requirements

  • Bachelor’s degree in a STEM field or equivalent practical experience.
  • Significant hands-on experience as a data engineer or senior software engineer working on data-intensive systems (typically 5–8+ years).
  • Strong SQL expertise and experience with data modeling on large-scale analytical platforms (Snowflake preferred).
  • Proven experience building and operating production data pipelines using Python and cloud services.
  • Proficiency with dbt or similar transformation and analytics engineering tools.
  • Solid software engineering fundamentals, including system design, debugging, performance optimization, and maintainable code practices.
  • Experience with Git/GitHub workflows, including pull requests, code reviews, and collaborative development.
  • Hands-on experience building or working with CI/CD pipelines (GitHub Actions preferred), including automated testing and deployments.
  • Ability to work effectively across both data engineering and analytics engineering responsibilities.
  • Strong hands-on experience building and operating data systems on AWS, including designing cloud-native architectures and working with services such as S3, IAM, EC2/ECS/EKS, Lambda, Glue, EMR, or related AWS data and compute services.

Nice To Haves

  • Experience with workflow orchestration tools such as Dagster, Airflow, or similar.
  • Familiarity with streaming or event-driven systems (Kafka, Flink, Kinesis).
  • Experience supporting ML/AI workflows or integrating ML models into data products.
  • Master’s degree in a STEM field.
  • Prior experience working in healthcare, regulated environments, or large-scale enterprise data platforms.

Responsibilities

  • Design, build, and maintain scalable data pipelines for ingestion, transformation, and delivery using Python, SQL, Spark, APIs, and modern cloud-native tools.
  • Develop high-quality analytics and data models in Snowflake using dbt or similar frameworks, with a focus on performance, correctness, and maintainability.
  • Apply strong software engineering practices to data systems, including modular design, testing, code reviews, and version control.
  • Implement automation, monitoring, and observability to ensure reliable and resilient data pipelines in production.
  • Collaborate closely with product managers, analytics engineers, data scientists, and application engineers to deliver data products that drive business and clinical outcomes.
  • Support advanced analytics and ML use cases by building feature pipelines and data foundations for classical ML models and emerging AI-driven workloads.
  • Contribute to shared standards, patterns, and best practices across the data engineering organization through hands-on contributions and technical collaboration.

Benefits

  • Comprehensive medical, vision, dental, life, and AD&D insurance; short-term and long-term disability insurance; sleep care management; Health Savings Account (HSA); Flexible Spending Account (FSA); commuter benefits; 401(k); Employee Stock Purchase Plan (ESPP); Employee Assistance Program (EAP); and tuition assistance.
  • Employees accrue 15 days of Paid Time Off (PTO) in their first year of employment, receive 11 paid holidays plus 3 floating days, and are eligible for 14 weeks of primary caregiver leave or 2 weeks of secondary caregiver leave when welcoming new family members.