AI/ML Data Engineer (Databricks)

QuidelOrtho · San Diego, CA
$85,000 - $105,000 · Remote

About The Position

The Opportunity

QuidelOrtho unites the strengths of Quidel Corporation and Ortho Clinical Diagnostics, creating a world-leading in vitro diagnostics company with award-winning expertise in immunoassay and molecular testing, clinical chemistry and transfusion medicine. We are more than 6,000 strong and do business in over 130 countries, providing answers with fast, accurate and consistent testing where and when they are needed most – home to hospital, lab to clinic. Our culture puts our team members first and prioritizes actions that support happiness, inspiration and engagement. We strive to build meaningful connections with each other as we believe that employee happiness and business success are linked. Join us in our mission to transform the power of diagnostics into a healthier future for all.

The Role

We are seeking an AI/ML Data Engineer to support our Global Data and Analytics team. The AI/ML Data Engineer will be responsible for designing, building, and optimizing data pipelines and infrastructure using Databricks to support AI and machine learning (ML) initiatives. This role will involve working closely with business stakeholders to identify high-value AI/ML use cases and translating business requirements into technical solutions. The engineer will work to ensure the successful deployment of AI/ML solutions at scale, leveraging Azure services and Databricks tools. This position is remote eligible, with a strong preference for candidates based in San Diego who are able to maintain an office presence.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Minimum 3 years of experience in data engineering, with a strong focus on Databricks and AI/ML applications.
  • Proven experience working directly with business stakeholders to identify and implement AI/ML use cases.
  • Expertise in Apache Spark and hands-on experience with Databricks for building and optimizing data pipelines.
  • Strong programming skills in Python and Scala for data engineering and machine learning workflows in Databricks.
  • Experience with Azure Data Factory, Azure Data Lake, Azure Blob Storage, and Azure Synapse Analytics.
  • Proficiency with Databricks Delta Lake for data reliability and performance optimization.
  • Familiarity with MLflow and Databricks Runtime for Machine Learning for model management and deployment.
  • Knowledge of Azure DevOps for implementing CI/CD pipelines in Databricks-based projects.
  • Strong understanding of data governance, security practices, and compliance requirements in cloud environments.
  • Familiarity with emerging Databricks features such as Delta Live Tables and Unity Catalog.
  • Ability to travel up to 5-10%.
  • This position is not currently eligible for visa sponsorship.

Nice To Haves

  • Experience with real-time data processing using Apache Kafka or Azure Event Hubs.
  • Master's degree in Computer Science or related technical fields.

Responsibilities

  • Work directly with business stakeholders to identify and define AI/ML use cases, translating business needs into technical requirements.
  • Design, develop, and optimize scalable data pipelines in Databricks for AI/ML applications, ensuring efficient data ingestion, transformation, and storage.
  • Build and manage Apache Spark-based data processing jobs in Databricks, ensuring performance optimization and resource efficiency.
  • Implement ETL/ELT processes and orchestrate workflows using Azure Data Factory, integrating various data sources such as Azure Data Lake, Blob Storage, and Microsoft Fabric.
  • Collaborate with Data Engineering teams to meet data infrastructure needs for model training, tuning, and deployment within Databricks and Azure Machine Learning.
  • Monitor, troubleshoot, and resolve issues within Databricks workflows, ensuring smooth operation and minimal downtime.
  • Implement best practices for data security, governance, and compliance within Databricks and Azure environments.
  • Automate data and machine learning workflows using CI/CD pipelines through Azure DevOps.
  • Maintain documentation of workflows, processes, and best practices to ensure knowledge sharing across teams.
  • Perform other work-related duties as assigned.

Benefits

  • QuidelOrtho offers a comprehensive benefits package including medical, dental, vision, life, and disability insurance, along with a 401(k) plan, an employee assistance program, an Employee Stock Purchase Plan, paid time off (including sick time), and paid holidays.