Principal Data Utility

BHP Career Portal
Hybrid

About The Position

The Principal Data Utility leads the uplift of data engineering capabilities across BHP, ensuring scalable design patterns, consistent practices, and high‑quality data solutions that support reliable, enterprise‑wide decision‑making. In this role you will play a pivotal part in shaping the future of data engineering across our global organization, blending deep technical capability with strategic architectural leadership. You will partner closely with regional Data Utility teams around the world to remove technical blockers, uplift engineering practices, and drive consistency in design patterns, frameworks, and reusable components. You will also help build and foster a global community of Data Engineers, promoting knowledge sharing, collaboration, and alignment across teams.

Requirements

  • Strong background with ETL and data warehousing tools such as Informatica, Talend, Pentaho or DataStage.
  • Hands‑on experience with Hadoop, Spark, Storm, Impala and related platforms.
  • Strong understanding of RDBMS concepts, ETL principles and end‑to‑end data pipeline development.
  • Solid knowledge of data modelling techniques (ERDs, star schema, snowflake schema).
  • Experience with AWS services including S3, EC2, EMR, RDS, Redshift and Kinesis.
  • Exposure to distributed processing (Spark, Hadoop, EMR), RDBMS (SQL Server, Oracle, MySQL, PostgreSQL), MPP (Redshift, Teradata) and NoSQL technologies (MongoDB, DynamoDB, Cassandra, Neo4J, Titan).
  • Experience designing and building streaming pipelines using tools such as Kafka, Kafka Streams or Spark Streaming.
  • Strong proficiency in Python and at least two of: Scala, SQL or Java.
  • Experience deploying production applications, including testing, packaging, monitoring and release management.
  • Proficiency with Git‑based source control and CI/CD pipelines, ideally GitLab.
  • Strong engineering discipline including code reviews, testing frameworks and maintainable coding practices.
  • Master’s degree in Computer Science, MIS, Engineering or a related field.
  • At least 10 years’ experience in Data Engineering or Architecture.
  • Experience working within DevOps, Agile, Scrum or Continuous Delivery environments.
  • Ability to mentor team members and support capability development across teams.
  • Strong communication, listening and influencing skills.
  • High levels of motivation, adaptability and problem‑solving capability.

Nice To Haves

  • Experience with structured, semi‑structured and unstructured data.
  • Understanding of data governance, lineage and data quality approaches.
  • Experience with Infrastructure‑as‑Code tools such as Terraform.
  • Exposure to workflow orchestration tools like Azkaban, Luigi or Airflow.
  • Experience enabling data consumption through APIs, event streams or data marts.
  • Experience with MuleSoft, Solace or StreamSets.

Responsibilities

  • Interface with Data Utility teams across the globe to foster a close‑knit technical forum, enabling teams to share knowledge, designs, and code, and providing hands‑on support when needed.
  • Contribute to a global technical forum to identify pain points, gaps, and technology issues—and drive alignment on shared solutions.
  • Shape the strategic direction for data engineering across BHP through future‑focused architectural guidance.
  • Work closely with internal customers to understand their data requirements, model data structures, and design and implement scalable ingestion pipelines from operational and enterprise systems.
  • Lead the design and development of integration solutions and ETL pipelines, ensuring high‑quality documentation and approval of engineering patterns.
  • Collaborate with on‑prem and cloud platform teams to identify capability gaps and evaluate emerging tools and technologies.
  • Work with the Enterprise & Global (E&G) Data Utility team to enhance and evolve the E&G data platform to meet customer needs.