About The Position

As an Azure Snowflake Data Engineer, you will play a key role in designing and delivering modern data platforms for our clients. This is a client-facing role suited to someone who thrives in consulting environments and enjoys solving complex business and technical challenges. The key responsibilities, required experience, and highly regarded skills are detailed in the sections below.

Requirements

  • 5+ years’ experience in data engineering, ideally within consulting or professional services
  • Demonstrated experience delivering large-scale enterprise data projects (multi-source integrations, high-volume datasets, complex transformations)
  • Strong hands-on expertise in:
      • Microsoft Azure data services (Data Lake, Synapse, Azure SQL, etc.)
      • Azure Data Factory (ADF)
      • Snowflake
      • dbt
  • Advanced SQL and strong data modelling skills (dimensional modelling, ELT patterns)
  • Experience designing cloud-native architectures
  • Strong understanding of DevOps, CI/CD, and infrastructure-as-code principles
  • Experience implementing data quality, lineage, and governance frameworks
  • Excellent stakeholder communication skills
  • A consulting mindset with strong client engagement capability
  • Proven ability to deliver in complex, multi-stakeholder environments
  • Strong ownership and accountability across the full delivery lifecycle
  • Ability to balance technical excellence with pragmatic business outcomes
  • Passion for mentoring and uplifting engineering capability

Nice To Haves

  • Experience in Australian enterprise environments (banking & finance, retail, government)
  • Microsoft Azure or Snowflake certifications
  • Experience with Agile delivery methodologies
  • Experience contributing to solution architecture and technical design documentation

Responsibilities

  • Design and deliver scalable, secure, and cost-effective data platforms on Microsoft Azure
  • Build and orchestrate robust data pipelines using Azure Data Factory (ADF)
  • Develop, optimise, and manage enterprise-grade data models in Snowflake
  • Implement transformation and analytics engineering frameworks using dbt
  • Lead or contribute to large-scale data transformation programs
  • Work closely with client stakeholders to gather requirements and translate business needs into technical solutions
  • Establish best practices across data architecture, modelling, governance, and CI/CD
  • Optimise performance, reliability, and cost in cloud environments