Cloud Data Engineer

RadNet · Rochester Hills, MI

About The Position

Artificial intelligence, advanced technology, and the very best in patient care: with decades of expertise, RadNet is Leading Radiology Forward. The core of RadNet's success is its people, supported by dynamic cross-training and advancement opportunities in a team-focused environment and a shared commitment to a better healthcare experience. As a Cloud Data Engineer at RadNet, you will join a dedicated team of professionals who deliver quality, value, and access in 21st-century healthcare, aligning all stakeholders (patients, providers, payors, and regulators) to achieve the best clinical outcomes.

Requirements

  • 5+ years of data engineering experience, including hands-on GCP work.
  • Strong Python and SQL skills.
  • Hands-on with BigQuery, Dataflow, Pub/Sub, and Dataproc.
  • Experience with dbt/Airflow/Composer for orchestration.
  • Experience integrating APIs and SaaS sources using Airbyte.
  • Knowledge of data formats (Parquet, Avro, JSON, Delta Lake).

Nice To Haves

  • Experience with MongoDB Atlas or Neo4j AuraDB preferred.
  • Familiarity with vector search or graph analytics preferred.
  • Healthcare data experience (HL7, FHIR) preferred.

Responsibilities

  • Own end-to-end development of cloud ELT pipelines — from CDC/API ingestion (Airbyte) and batch loads (Cloud Build/Composer) into BigQuery landing tables to dbt transformations that publish curated Silver/Gold datasets.
  • Design and document Medallion-style models (Bronze → Silver → Gold) in dbt with clear naming, sources, exposures, and owners; maintain dbt project structure, packages, and environments.
  • Implement data quality with dbt tests (generic/custom), freshness checks, and documentation (dbt docs); publish artifacts to the Analytics Hub/catalog for discoverability.
  • Optimize BigQuery for dbt: partitioning, clustering, materializations (incremental/merge), cost controls (slots/quotas), job monitoring, and query performance tuning.
  • Support streaming use cases by landing real-time data via Pub/Sub with BigQuery subscriptions (or Dataflow/Beam where required) and shaping near-real-time models with dbt incremental strategies.
  • Build orchestration and CI/CD for dbt using dbt Cloud or dbt Core (Cloud Build/Composer), with code review, automated tests, and artifact promotion across Dev/Test/Prod.
  • Partner with BI and AI/ML teams to expose trusted datasets and features; publish contract-backed schemas and semantic conventions aligned to enterprise KPIs.
  • Migrate and reconcile legacy on-prem pipelines (e.g., SQL Server CDC) into GCP; validate row-level fidelity and handle late-arriving data and schema evolution scenarios.
  • Implement security, privacy, and governance (IAM, CMEK, BigQuery row- and object-level security, HIPAA); contribute to auditable data lineage (dbt exposures + warehouse lineage).
  • Establish monitoring/alerting for pipeline reliability (Cloud Monitoring/Logging), with SLAs/SLOs, retries/backfills, and incident runbooks; participate in on-call rotation as needed.

Benefits

  • Comprehensive medical, dental, and vision coverage.
  • Health Savings Accounts with employer funding.
  • Wellness dollars.
  • 401(k) employer match.
  • Free services at any of our imaging centers for you and your immediate family.