About The Position

In this role, you will design, build, and optimize enterprise-scale data pipelines on Azure Databricks, supporting structured, semi-structured, and unstructured data. You will work closely with data architects, security teams, and business stakeholders to implement best practices for data governance, security, and high-performance data processing. This position involves hands-on development of Delta Lake architectures, CI/CD pipelines, and orchestrated workflows to deliver reliable, scalable data products. Operating in a fully remote, collaborative environment, you will have the opportunity to influence the overall data platform strategy while solving complex technical challenges. Your work will enable analytics, reporting, and AI initiatives across the enterprise. The role combines technical depth, architectural insight, and operational excellence, offering strong growth potential in cloud data engineering.

Requirements

  • Minimum 5 years of professional experience delivering Azure Databricks solutions in enterprise environments.
  • Strong expertise in Databricks components: Workspaces, Notebooks, Jobs, Workflows, Repos, Unity Catalog, Delta Lake, Delta Live Tables, and MLflow.
  • Solid knowledge of Azure Data Platform services: ADLS Gen2, Azure Key Vault, Azure Monitor, Azure Log Analytics, and Microsoft Entra ID/RBAC; experience with the Databricks Terraform provider is a plus.
  • Experience implementing data security and governance frameworks including access controls, masking, row-level security, ABAC, governed tags, credential management, lineage, and auditability.
  • Proficiency in Python, SQL, PySpark, Git, Spark performance tuning, and distributed computing concepts.
  • Familiarity with AI/ML lifecycle and MLflow model management.
  • Experience working in Agile or DevOps-oriented teams, with strong analytical, problem-solving, and communication skills.
  • Fluency in Portuguese and English.

Responsibilities

  • Design, develop, and optimize ETL/ELT data pipelines using Azure Databricks (Python, PySpark, SQL, Delta Lake).
  • Configure and maintain Databricks workspaces, clusters, jobs, repositories, and workflow schedules for multi-team data product delivery.
  • Implement and enforce data governance and security best practices, including access controls, lineage, and auditing frameworks.
  • Build and maintain Delta Lake architectures with medallion (bronze/silver/gold) layer structures.
  • Integrate Databricks pipelines with Azure Data Platform services, ensuring reliable orchestration, observability, and CI/CD automation.
  • Collaborate with data architects, data owners, and cross-functional teams to align platform solutions with enterprise standards.
  • Optimize pipeline performance, compute cost, and system efficiency through code-level and cluster-level tuning strategies.
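The medallion (bronze/silver/gold) flow mentioned above can be sketched conceptually. The example below is a minimal illustration only: plain Python dictionaries stand in for PySpark DataFrames, and all function and field names (`to_silver`, `to_gold`, `order_id`, `region`, `amount`) are hypothetical, not part of any specific Databricks API.

```python
# Conceptual sketch of a medallion (bronze -> silver -> gold) flow.
# Plain Python stands in for PySpark DataFrames; in Databricks each
# step would read from and write to a Delta table per layer instead.

def to_silver(bronze_rows):
    """Clean bronze records: drop rows missing required fields, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # skip malformed records (in practice: quarantine them)
        silver.append({
            "order_id": str(row["order_id"]),
            "region": (row.get("region") or "unknown").lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate silver records into business-level metrics per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

# Hypothetical raw input as it might land in the bronze layer.
bronze = [
    {"order_id": 1, "region": "EU", "amount": "10.5"},
    {"order_id": 2, "region": None, "amount": "4.0"},
    {"order_id": None, "region": "US", "amount": "99"},  # malformed: dropped
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'eu': 10.5, 'unknown': 4.0}
```

In a real Databricks pipeline, each stage would typically be a Delta Live Tables definition or a scheduled job, with governance controls (access, lineage, auditing) applied at each layer.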

Benefits

  • Competitive compensation aligned with experience.
  • Fully remote work environment.
  • Work equipment provided, suited to the role and its responsibilities.
  • Comprehensive benefits plan.
  • Opportunity to work with expert teams on high-impact, large-scale projects.
  • Exposure to long-term, strategic client initiatives in diverse industries.
© 2024 Teal Labs, Inc