Databricks Architect | US

Cuesta Partners
$145,000 - $188,000 | Remote

About The Position

We are looking for a Databricks Architect to lead the delivery of data engineering solutions in a client-facing environment. This is a hands-on leadership role where you will work directly with clients — including C-level stakeholders — to design, build, and deploy production-grade Databricks platforms.

Requirements

  • 6+ years of experience in Data Engineering, Cloud Architecture, or related roles.
  • 3+ years of hands-on experience with Databricks.
  • Strong proficiency in PySpark, Spark SQL, Python, and SQL.
  • Deep experience with Delta Lake, Unity Catalog, Delta Live Tables, and Databricks Jobs.
  • Proven experience delivering Databricks projects in a consulting or professional services environment.
  • Strong understanding of data lake concepts and formats (Delta, Iceberg, Parquet, etc.).
  • Experience designing and managing Unity Catalog for governance, access control, and lineage.
  • Familiarity with medallion architecture (bronze/silver/gold) patterns.
  • Hands-on experience with Git version control, pull requests, code reviews, and collaborative development workflows.
  • CI/CD and infrastructure-as-code for Databricks resources (Terraform, Asset Bundles, GitHub Actions).
  • MLflow experience: experiment tracking, model registry, deployment, and observability.
  • Cloud platform experience (AWS and/or Azure).
  • Experience working in client-facing, consulting, or cross-functional environments.
  • Experience leading technical projects or mentoring engineering teams.
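
The medallion (bronze/silver/gold) pattern mentioned above can be sketched as follows. This is a toy, framework-free illustration of the layering only: in real Databricks work each layer would be a Delta table populated by PySpark or Delta Live Tables, and all names and sample records here are hypothetical.

```python
# Toy sketch of the medallion pattern: bronze lands raw data as-is,
# silver cleans and deduplicates, gold serves business aggregates.
# Plain Python stands in for PySpark/Delta purely for illustration.

RAW_EVENTS = [  # hypothetical raw order events
    {"order_id": "1", "amount": "19.99", "country": "us"},
    {"order_id": "2", "amount": "bad-data", "country": "US"},
    {"order_id": "1", "amount": "19.99", "country": "us"},  # duplicate
]

def bronze(events):
    """Bronze: append-only landing of raw records, no cleaning."""
    return list(events)

def silver(bronze_rows):
    """Silver: deduplicate on key, enforce types, standardize values."""
    seen, out = set(), []
    for row in bronze_rows:
        if row["order_id"] in seen:
            continue
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # in practice, quarantine malformed rows instead
        seen.add(row["order_id"])
        out.append({"order_id": row["order_id"],
                    "amount": amount,
                    "country": row["country"].upper()})
    return out

def gold(silver_rows):
    """Gold: business-level aggregate (revenue per country)."""
    totals = {}
    for row in silver_rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

print(gold(silver(bronze(RAW_EVENTS))))  # {'US': 19.99}
```

The key design point is that each layer is reproducible from the one below it, which is what makes the pattern robust to reprocessing and schema fixes.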

Nice To Haves

  • Databricks certifications (Data Engineer Associate or Professional, Machine Learning Associate or Professional).
  • Experience with agentic AI workflows, including building agents, skills, and tools on Databricks (Agentbricks), as well as with frameworks such as LangChain and Claude-based agentic workflows.

Responsibilities

  • Design and architect scalable Databricks-based solutions for enterprise clients.
  • Lead end-to-end cloud data platform implementations across AWS and/or Azure.
  • Design and build production-grade ETL/ELT pipelines using Spark, PySpark, Databricks Workflows, and Spark Declarative Pipelines.
  • Partner directly with client stakeholders to understand business needs and translate them into technical solutions.
  • Provide technical leadership and guidance to data engineering teams.
  • Implement CI/CD pipelines and infrastructure-as-code for Databricks resources.
  • Drive collaborative development through Git workflows, pull requests, and code reviews.
  • Collaborate cross-functionally with analytics, engineering, and business teams.
  • Serve as the primary technical point of contact for clients, including C-level executives.
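
For the infrastructure-as-code responsibility above, a minimal sketch of a Databricks Asset Bundle configuration might look like the following. All names, paths, and hosts are hypothetical placeholders, and the fields shown are a small subset of the bundle schema.

```yaml
# Hypothetical databricks.yml: a minimal Asset Bundle defining one job
# with a dev and a prod deployment target.
bundle:
  name: client_lakehouse        # assumed project name

resources:
  jobs:
    nightly_etl:                # assumed job key
      name: nightly-etl
      tasks:
        - task_key: bronze_to_silver
          notebook_task:
            notebook_path: ../notebooks/bronze_to_silver   # assumed path

targets:
  dev:
    mode: development
    workspace:
      host: https://<workspace-url>   # placeholder
  prod:
    mode: production
    workspace:
      host: https://<workspace-url>   # placeholder
```

A bundle like this would typically be deployed per environment (e.g. `databricks bundle deploy -t dev`) from a CI/CD pipeline such as GitHub Actions, keeping job definitions versioned alongside code.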

Benefits

  • Constant opportunities for exposure & learning
  • Flexible working location, supporting harmony between work and personal life
  • Agency and influence in the company’s total strategy and direction
  • Collaboration with a high-performing team
  • Competitive base salary and target bonus of 20-25%
  • 401k, healthcare benefits, paid time-off, and more!