Senior DataOps Engineer - Remote US

Smile Digital Health
Remote

About The Position

The DataOps Engineer owns data analytics infrastructure at Smile Digital Health: from build and deployment pipelines through to large-scale data processing and analytics environments. This role bridges DevOps and data engineering by provisioning and hardening cloud infrastructure, automating CI/CD workflows, and operating the Spark-based and Big Data systems that power both our internal platform and customer deployments.

Requirements

  • 6+ years in a DevOps, DataOps, or data platform engineering role.
  • Hands-on experience with Apache Spark and at least one managed Spark platform (Databricks, AWS EMR, GCP Dataproc, or equivalent).
  • Proficiency in Python; solid working knowledge of Java or Scala.
  • Experience with pipeline orchestration tools such as Apache Airflow, Prefect, or similar.
  • Strong CI/CD experience with GitLab CI, Jenkins, or GitHub Actions.
  • Infrastructure-as-code proficiency with Terraform, Ansible, or equivalent (AWS CloudFormation and Azure ARM Templates are a plus).
  • Solid experience operating Linux systems and public cloud environments (AWS, Azure, GCP, or OCI).
  • Familiarity with Kubernetes or other container orchestration platforms for data workloads.
  • Ability to manage multiple workstreams in parallel with strong attention to delivery timelines.
  • Customer-first mindset with strong written and verbal communication skills.

Responsibilities

  • Build, maintain, and run CI/CD pipelines and infrastructure-as-code for the Smile Digital Health platform and associated services.
  • Provision, configure, and operate cloud-based Spark clusters and distributed data processing environments, including hands-on work with orchestration tools such as Airflow, Databricks, or EMR.
  • Write, test, and maintain data pipelines on the same infrastructure you manage, from environment setup through to production monitoring.
  • Design and maintain scalable, secure infrastructure templates and deployment automation across AWS, Azure, GCP, or OCI environments.
  • Investigate and resolve data pipeline and integration issues, providing root-cause analysis and durable fixes.
  • Monitor running systems and pipelines, respond to incidents, tune performance, and manage cloud infrastructure costs.
  • Foster an Everything-as-Code culture and promote DataOps best practices across the team.
  • Assist developers and engineers with deployments and builds as needed.

Benefits

  • Remote Work Environment
  • Flexible Time Away From Work Policy including PTO, Personal and Sick Days
  • Competitive Salary and Health/Medical Benefits
  • RRSP/TFSA/401K Employee Contribution
  • Life and Disability Insurance
  • Employee Assistance Program
  • FHIR Study Program and Skillsoft Learning
  • Super HAPI Fun Club