Senior Data Engineer

Valvoline Global

About The Position

At Valvoline Global Operations, we’re proud to be The Original Motor Oil, but we’ve never rested on being first. Founded in 1866, we introduced the world’s first branded motor oil, staking our claim as a pioneer in the automotive and industrial solutions industry. Today, as an affiliate of Aramco, one of the world’s largest integrated energy and chemicals companies, we are driven by innovation and committed to creating sustainable solutions for a better future. With a global presence, we develop future-ready products and provide best-in-class services for our partners around the world. For us, originality isn’t just about where we began; it’s about where we’re headed and how we’ll lead the way. We are originality in motion.

Our corporate values—Care, Integrity, Passion, Unity, and Excellence—are at the heart of everything we do. These values define how we operate, how we treat one another, and how we engage with our partners, customers, and the communities we serve. At Valvoline Global, we are united in our commitment to:

  • Treating everyone with care.
  • Acting with unwavering integrity.
  • Striving for excellence in all endeavors.
  • Delivering on our commitments with passion.
  • Collaborating as one unified team.

When you join Valvoline Global, you’ll become part of a culture that celebrates creativity, innovation, and excellence. Together, we’re shaping the future of automotive and industrial solutions.

Job Purpose

We are seeking a highly skilled and motivated Data Engineer to join our growing data and analytics team. The ideal candidate will have strong experience designing and developing scalable data pipelines, integrating complex systems, and optimizing data workflows. Proficiency in Databricks and SAP Datasphere is preferred, as these platforms are central to our data ecosystem; however, a demonstrated aptitude for quickly adapting to new technologies is also highly valued.
This role will play a critical part in ensuring the accessibility, reliability, and performance of our enterprise data infrastructure to enable impactful business intelligence and data science initiatives.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 5-7+ years of experience in a data engineering or related role.
  • Strong knowledge of data engineering principles, data warehousing concepts, and modern data architecture.
  • Proficiency in SQL and at least one programming language (e.g., Python, Scala).
  • Experience with cloud platforms (e.g., Azure, AWS, or GCP), particularly in data services.
  • Familiarity with data processing and orchestration tools (e.g., PySpark, Airflow, Azure Data Factory) and CI/CD pipelines.

Nice To Haves

  • Hands-on experience with Databricks (including Spark/PySpark, Delta Lake, MLflow, Unity Catalog, etc.).
  • Practical experience working with SAP Datasphere (or SAP Data Warehouse Cloud) in data modeling and data integration scenarios.
  • SAP BW or SAP HANA experience is a plus.
  • Experience with BI tools like Power BI or Tableau.
  • Understanding of data governance frameworks and data security best practices.
  • Exposure to data lakehouse architecture and real-time streaming data pipelines.
  • Certifications in Databricks, SAP, or cloud platforms are advantageous.

Responsibilities

  • Design, build, and maintain robust, scalable, and high-performance data pipelines using Databricks and SAP Datasphere.
  • Collaborate with data architects, analysts, data scientists, and business stakeholders to gather requirements and deliver data solutions aligned with stakeholders’ goals.
  • Integrate diverse data sources (e.g., SAP, APIs, flat files, cloud storage) into the enterprise data platforms.
  • Ensure high standards of data quality and implement data governance practices.
  • Stay current with emerging trends and technologies in cloud computing, big data, and data engineering.
  • Provide ongoing support for the platform, troubleshoot any issues that arise, and ensure high availability and reliability of data infrastructure.
  • Create documentation for the platform infrastructure and processes, and train other team members and users to use the platform effectively.