Databricks Data Engineer

Datavail Infotech, Boulder, CO

About The Position

As a Databricks Data Engineer, you will work directly with clients across multiple industries to design, implement, and optimize Databricks-based data solutions. You will be a key member of our Professional Services delivery teams, delivering high-quality projects on time and within scope while building strong client relationships. This is a client-facing role that combines hands-on technical delivery with consulting best practices.

Requirements

  • 3-5 years of hands-on Databricks experience (or strong Spark experience with significant recent Databricks work)
  • Proven experience delivering Databricks projects in a consulting or professional services environment (preferred) or equivalent client-facing project delivery
  • Strong proficiency in PySpark, Spark SQL, Python, and SQL
  • Deep experience with Delta Lake, Unity Catalog, Delta Live Tables, and Databricks Jobs
  • Hands-on experience with Git version control, pull requests, code reviews, and collaborative development workflows
  • Cloud platform experience with at least one of Azure, AWS, or GCP
  • Excellent client-facing and communication skills - able to explain complex concepts to both technical and non-technical audiences
  • Solid understanding of data governance, security, and Lakehouse best practices
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)

Nice To Haves

  • Databricks certifications (Data Engineer Associate / Professional, Lakehouse, etc.)
  • Experience with dbt, Airflow, Terraform, Databricks Asset Bundles (DABs), or MLflow
  • Background in specific industries (Financial Services, Healthcare, Retail, Manufacturing, etc.)
  • Experience with large-scale data migrations or legacy system modernization
  • Knowledge of streaming (Spark Structured Streaming / Kafka) and real-time analytics

Responsibilities

  • Lead and contribute to end-to-end Databricks implementations for clients, including data migration, Lakehouse architecture, and pipeline development.
  • Gather technical requirements, design solutions, and present recommendations to client stakeholders (technical and business).
  • Build scalable ETL/ELT pipelines using PySpark, Delta Lake, Delta Live Tables (DLT), and Databricks Workflows.
  • Design and implement Databricks AI/BI Genie spaces for natural-language data exploration.
  • Design and implement semantic layers.
  • Use Databricks AI features to accelerate development, debugging, and code optimization.
  • Design and implement Lakebase architectures for operational and analytical workloads, including transactional data use cases.
  • Develop solutions using SDLC best practices, including modular code design, testing, and documentation.
  • Use Git-based version control with proper branching strategies.
  • Implement CI/CD pipelines for Databricks assets.
  • Implement data quality checks, validations, and expectations within workflows.
  • Design and implement Unity Catalog governance, security, and lineage solutions.
  • Optimize Databricks workloads for performance, cost, and reliability (Photon, cluster policies, Liquid Clustering, Auto Loader, etc.).
  • Integrate Databricks with client ecosystems (Azure, AWS, GCP, Snowflake, Kafka, legacy systems, etc.).
  • Support client workshops, proof-of-concepts (POCs), and knowledge transfer sessions.
  • Collaborate with client data teams to ensure successful adoption and handover of solutions.
  • Deliver projects following consulting methodologies while meeting quality, timeline, and budget expectations.
  • Document architectures, runbooks, and best practices for client use.
  • Participate in solutioning activities (scoping, estimation, technical demos) as needed.

Benefits

  • Datavail is one of the largest data-focused services companies in North America, providing both professional and managed services and expertise in Database Management, Application Development and Management, Cloud & Infrastructure Management, Packaged Applications, and BI/Analytics.