About The Position

Why Valvoline Global Operations?

At Valvoline Global Operations, we're proud to be The Original Motor Oil, but we've never rested on being first. Founded in 1866, we introduced the world's first branded motor oil, staking our claim as a pioneer in the automotive and industrial solutions industry. Today, as an affiliate of Aramco, one of the world's largest integrated energy and chemicals companies, we are driven by innovation and committed to creating sustainable solutions for a better future. With a global presence, we develop future-ready products and provide best-in-class services for our partners around the world. For us, originality isn't just about where we began; it's about where we're headed and how we'll lead the way. We are originality in motion.

Our corporate values—Care, Integrity, Passion, Unity, and Excellence—are at the heart of everything we do. These values define how we operate, how we treat one another, and how we engage with our partners, customers, and the communities we serve. At Valvoline Global, we are united in our commitment to:

  • Treating everyone with care.
  • Acting with unwavering integrity.
  • Striving for excellence in all endeavors.
  • Delivering on our commitments with passion.
  • Collaborating as one unified team.

When you join Valvoline Global, you'll become part of a culture that celebrates creativity, innovation, and excellence. Together, we're shaping the future of automotive and industrial solutions.

Job Purpose

We are seeking an experienced and forward-thinking Senior Manager to lead our Data Engineering and MLOps functions within Valvoline Global. This role will guide a team responsible for designing, building, and operationalizing the enterprise data platform that powers analytics, reporting, and machine learning across the organization. The ideal candidate has deep experience in Databricks for data engineering, data governance, and MLOps, with a demonstrated ability to leverage SAP systems such as SAP BW/4HANA, SAP S/4HANA, and SAP Datasphere. This leader will play a critical role in delivering trusted, high-quality data and ensuring our platforms are scalable, reliable, and aligned to business needs.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, Data Engineering, or a related field.
  • 8-12+ years of experience in data engineering, data platforms, or analytics technology roles.
  • 3-5+ years of leadership experience managing technical teams or driving enterprise-scale data initiatives.
  • Strong expertise in Databricks, including Spark/PySpark, Delta Lake, Unity Catalog, MLflow, and workflow orchestration.
  • Strong understanding of data engineering principles, ELT/ETL design, modern data warehousing, and cloud data architectures.
  • Proven ability to quickly learn and adopt SAP data technologies (SAP BW/4HANA, SAP S/4HANA, SAP Datasphere).
  • Experience with cloud platforms such as Azure, AWS, or GCP, including data storage, compute, orchestration, and DevOps tooling.
  • Proficiency in SQL and at least one programming language (Python preferred).
  • Experience implementing CI/CD and DevOps practices for data and machine learning pipelines.
  • Strong communication, stakeholder management, and cross-functional leadership skills.

Nice To Haves

  • Hands-on experience with Databricks lakehouse architecture, including Delta Lake, Unity Catalog, MLflow, and advanced Spark optimization.
  • Practical experience working with SAP Datasphere or other SAP data environments for modeling and integration scenarios.
  • Familiarity with SAP BW or SAP HANA data structures and extractors.
  • Understanding of data governance frameworks, access management, and data security best practices.
  • Experience supporting or enabling data science teams through feature engineering, model deployment, and MLOps tooling.
  • Exposure to real-time or streaming data integration frameworks.
  • Certifications in Databricks, SAP, or major cloud platforms are a plus.
  • Experience with BI tools such as Power BI or Tableau.

Responsibilities

  • Lead, mentor, and develop a high-performing Data Engineering and MLOps team that supports enterprise data and analytics initiatives.
  • Design, build, and optimize robust, scalable, and high-performance data pipelines and data products using Databricks (PySpark, Delta Lake, Unity Catalog).
  • Oversee integration of SAP systems—including SAP BW/4HANA, SAP S/4HANA, and SAP Datasphere—into cloud data platforms and analytics environments.
  • Partner with data architects, analysts, data scientists, and business stakeholders to gather requirements and deliver data solutions that support strategic objectives.
  • Establish and manage MLOps frameworks for automated model training, deployment, monitoring, and lifecycle management using Databricks and MLflow.
  • Ensure and enforce data quality, metadata management, lineage tracking, and governance standards across the data ecosystem.
  • Implement monitoring, observability, and incident management processes to maintain platform reliability, performance, and uptime.
  • Evaluate emerging tools, technologies, and best practices in modern data engineering, SAP integration, cloud architecture, and machine learning operations.
  • Create documentation, reusable frameworks, and standards to ensure consistency, transparency, and scalability across all data processes.
  • Collaborate with security, cloud, SAP, and enterprise architecture teams to ensure compliance, safety, and alignment with corporate technology strategy.