Data Architect (Azure, Lakehouse)

Exadel Inc, Town of Poland, NY
Hybrid

About The Position

Exadel is seeking a Data Architect with expertise in Azure and Lakehouse technologies for a 4-month engagement, with the possibility of extension. This role involves auditing and redesigning a Databricks environment, assessing medallion architecture and ADF pipeline design, and leading the engineering delivery of remaining scope. The Data Architect will be responsible for technical design, documentation, and ensuring architectural decisions are clear and explainable to stakeholders. This is a client-facing role requiring strong communication and problem-solving skills, particularly in a project recovery context.

Requirements

  • Expert-level proficiency in Azure Data Factory: pipeline design, linked services, integration runtimes (including Self-Hosted IR), incremental load patterns, and CDC
  • Strong command of Azure Data Lake Storage Gen2, Unity Catalog, and Azure networking fundamentals relevant to data platform connectivity (VNets, Private Endpoints)
  • Hands-on experience with Azure DevOps for CI/CD pipeline construction covering Databricks and ADF deployment across Dev / Test / Prod environments
  • Deep, hands-on Databricks experience — this is not a theoretical understanding; you must have delivered real solutions on the platform
  • Expertise in Databricks SQL, Delta Lake, and the medallion architecture (Bronze / Silver / Gold)
  • Ability to design and implement performant Databricks SQL views at scale, including complex join patterns, cross-database references, and performance validation
  • Experience with Unity Catalog configuration and RBAC
  • Strong T-SQL / SQL Server background — you will be porting SQL Server views and need to understand source system data structures across OLTP and analytical databases
  • Proven ability to design and implement fact/dimension tables, staging tables, and analytic models in a lakehouse context
  • Experience building and running reconciliation frameworks to validate migrated data against source systems
  • Solid Python skills for pipeline logic, test scripting, and utility development
  • Demonstrated experience designing greenfield modern data platforms — you must be able to evidence solutions you have architected from scratch
  • Demonstrated experience leading modernisation or transformation of brownfield data environments — migrating legacy reporting layers, rationalising fragmented data estates, or re-platforming to cloud-native architectures
  • Ability to produce clear, concise architecture artefacts: solution design documents, data flow diagrams, entity models, and deployment architecture
  • Sound understanding of data modelling — conceptual, logical, and physical — in the context of Lakehouse and dimensional design
  • Proven ability to operate in a consulting or client-facing environment: credible, articulate, and capable of managing expectations under pressure
  • Experience producing and owning delivery plans — not just contributing to them
  • Comfortable working in a recovery or turnaround context, where rapid assessment and honest communication are more important than defending prior decisions
  • Experience working under an engagement lead / principal and operating with a high degree of autonomy within that structure
  • English level Upper-Intermediate
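To make the incremental load expectation above concrete, here is a minimal sketch of the high-watermark pattern commonly used in ADF pipelines: only rows modified after the last recorded watermark are loaded, and the watermark advances to the latest timestamp seen. The function name, the in-memory rows, and the `modified_at` column are illustrative assumptions, not part of any specific ADF implementation.

```python
from datetime import datetime

def incremental_load(source_rows, last_watermark):
    """Return rows changed since last_watermark, plus the new watermark.

    source_rows: list of dicts, each with a 'modified_at' datetime
    (a stand-in for a change-tracking column on the source table).
    """
    # Pick up only rows modified after the previous watermark.
    changed = [r for r in source_rows if r["modified_at"] > last_watermark]
    # Advance the watermark to the newest row loaded; if nothing
    # changed, keep the old watermark so the next run is a no-op.
    new_watermark = max((r["modified_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

In a real pipeline the watermark would be persisted (e.g. in a control table) between runs rather than passed in memory.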

Nice To Haves

  • Familiarity with Microsoft Purview for data governance and lineage
  • Familiarity with Databricks Lakeflow

Responsibilities

  • Rapidly review and understand the original Statement of Work, change requests, and any pre-existing contractual dependencies
  • Audit the current state of the Databricks environment: what has been built, its quality, and its alignment to the agreed design
  • Identify root causes for delivery gaps — distinguishing between client-side blockers, scope ambiguity, and engineering shortfalls
  • Assess the medallion architecture (Bronze / Silver / Gold) and the ADF pipeline design against best practice and the client's requirements
  • Review all outstanding work items (including dependent database views, gold views, ingestion, test framework, and UAT artefacts) and produce a clear, honest picture of the remaining effort
  • Validate and, where necessary, correct or redesign the existing solution architecture to ensure it is fit for purpose, scalable, and maintainable
  • Own the technical design for all remaining deliverables: ADF pipelines, Databricks SQL views, medallion layers, Unity Catalog, and CI/CD deployment via Azure DevOps
  • Ensure architectural decisions are documented clearly and are explainable to both technical and non-technical client stakeholders
  • Define the reconciliation and testing framework for view validation against SQL Server source systems
  • Lead and execute the engineering delivery of all remaining scope, including: porting SQL Server views to Databricks SQL, building and validating ADF pipelines (batch ingestion), implementing gold-layer dimensional models and analytic views, deploying via Azure DevOps CI/CD pipelines (Test and Production environments)
  • Produce high-quality, well-documented, maintainable code — not quick fixes
  • Execute or oversee the reconciliation framework to prove view correctness against source systems
  • Manage dependencies on the client (CDC enablement, SQL account provisioning, UAT availability) and escalate blockers promptly to the Engagement Lead
  • Produce a realistic, structured delivery plan covering all remaining scope, with clear milestones, dependencies, and acceptance criteria
  • Support the Engagement Lead in presenting the recovery plan to the client for sign-off
  • Attend client-facing working sessions as the technical authority — you must be credible and confident communicating with both IT and commercial stakeholders
  • Support the delivery management function with regular, accurate progress reporting
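As an illustration of the reconciliation framework described above, the sketch below compares cheap fingerprints (row counts and per-column sums) between a source dataset and its migrated counterpart. Both sides are represented here as in-memory lists of dicts; in practice each summary would come from a query against SQL Server and Databricks SQL respectively, and the function names are assumptions for this example only.

```python
def summarize(rows, numeric_cols):
    """Compute a cheap fingerprint: row count plus per-column sums."""
    return {
        "row_count": len(rows),
        "sums": {c: sum(r[c] for r in rows) for c in numeric_cols},
    }

def reconcile(source_rows, target_rows, numeric_cols):
    """Return a list of mismatch descriptions; an empty list means
    the migrated data matched the source on these checks."""
    src = summarize(source_rows, numeric_cols)
    tgt = summarize(target_rows, numeric_cols)
    issues = []
    if src["row_count"] != tgt["row_count"]:
        issues.append(
            f"row count: source={src['row_count']} target={tgt['row_count']}"
        )
    for c in numeric_cols:
        if src["sums"][c] != tgt["sums"][c]:
            issues.append(
                f"sum({c}): source={src['sums'][c]} target={tgt['sums'][c]}"
            )
    return issues
```

Fingerprint checks like this catch gross migration errors quickly; a full framework would add row-level diffing for the views that fail them.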

Benefits

  • International projects
  • In-office, hybrid, or remote flexibility
  • Medical healthcare
  • Recognition program
  • Ongoing learning & reimbursement
  • Well-being program
  • Team events & local benefits
  • Sports compensation
  • Referral bonuses
  • Top-tier equipment provision