AI/ML Supply Chain Engineer

QuidelOrtho
San Diego, CA
$85,000 - $105,000 | Hybrid | Posted 7 days ago

About The Position

At QuidelOrtho, we’re advancing the power of diagnostics for a healthier future for all. Join our mission as our next AI/ML Supply Chain Engineer. You will be responsible for designing, building, and optimizing data pipelines and infrastructure using Databricks to support AI and machine learning (ML) initiatives. This role involves working closely with business stakeholders to identify high-value AI/ML use cases and translate business requirements into technical solutions. The engineer will ensure the successful deployment of AI/ML solutions at scale, leveraging Azure services and Databricks tools. This is a hybrid position based out of our San Diego, CA (Summers Ridge) headquarters.

Requirements

  • This position is not currently eligible for visa sponsorship.
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 3+ years of experience in data engineering, with a strong focus on Databricks and AI/ML applications.
  • Proven experience working directly with business stakeholders to identify and implement AI/ML use cases.
  • Expertise in Apache Spark and hands-on experience with Databricks for building and optimizing data pipelines.
  • Strong programming skills in Python and Scala for data engineering and machine learning workflows in Databricks.
  • Experience with Azure Data Factory, Azure Data Lake, Azure Blob Storage, and Azure Synapse Analytics.
  • Proficiency with Databricks Delta Lake for data reliability and performance optimization.
  • Familiarity with MLflow and Databricks Runtime for Machine Learning for model management and deployment.
  • Knowledge of Azure DevOps for implementing CI/CD pipelines in Databricks-based projects.
  • Strong understanding of data governance, security practices, and compliance requirements in cloud environments.
  • Familiarity with emerging Databricks features such as Delta Live Tables and Unity Catalog.

Nice To Haves

  • Experience with real-time data processing using Apache Kafka or Azure Event Hubs.

Responsibilities

  • Work directly with business stakeholders to identify and define AI/ML use cases, translating business needs into technical requirements.
  • Design, develop, and optimize scalable data pipelines in Databricks for AI/ML applications, ensuring efficient data ingestion, transformation, and storage.
  • Build and manage Apache Spark-based data processing jobs in Databricks, ensuring performance optimization and resource efficiency.
  • Implement ETL/ELT processes and orchestrate workflows using Azure Data Factory, integrating various data sources such as Azure Data Lake, Blob Storage, and Microsoft Fabric.
  • Collaborate with Data Engineering teams to meet data infrastructure needs for model training, tuning, and deployment within Databricks and Azure Machine Learning.
  • Monitor, troubleshoot, and resolve issues within Databricks workflows, ensuring smooth operation and minimal downtime.
  • Implement best practices for data security, governance, and compliance within Databricks and Azure environments.
  • Automate data and machine learning workflows using CI/CD pipelines through Azure DevOps.
  • Maintain documentation of workflows, processes, and best practices to ensure knowledge sharing across teams.
  • Perform other work-related duties as assigned.

Benefits

  • QuidelOrtho offers a comprehensive benefits package including medical, dental, vision, life, and disability insurance, along with a 401(k) plan, an employee assistance program, an Employee Stock Purchase Plan, paid time off (including sick time), and paid holidays. All benefits are non-contractual, and QuidelOrtho may amend, terminate, or enhance the benefits provided as it deems appropriate.