About The Position

The Technical Data Architect is expected to lead from the front — setting direction, making hard trade-offs, and validating architectural designs through hands-on implementation where it matters most. This role defines the data platform architecture and personally implements selected foundational components (e.g., core ingestion patterns, Medallion layer contracts, CI/CD templates) to prove architectural decisions in practice, while collaborating closely with data engineers to guide and scale implementation across the platform.

Requirements

  • BS or BA in Computer Science or Software Engineering, or demonstrated equivalent experience.
  • 10+ years of experience in data architecture and engineering within high-tech or SaaS environments, with demonstrated ownership of enterprise-scale data platforms and solutions in production.
  • Strong hands-on expertise in modern cloud data architecture and engineering, including direct implementation of core platform components. Experienced in leveraging the Databricks Lakehouse Platform (Delta Lake, Unity Catalog, Delta Live Tables, Databricks SQL) and Google Cloud Platform (BigQuery, Dataflow, Dataproc, Pub/Sub, Composer) to build and optimize scalable, governed, high-performance data pipelines and models.
  • Proficient in modern data engineering practices, including Medallion design, CI/CD automation (GitHub, Cloud Build), metadata and lineage management, data modeling and governance, and both streaming and batch data processing.
  • Familiarity with AI/ML and analytics enablement, ensuring data readiness for model training, feature engineering, and intelligent insights.
  • Strong understanding of data governance, data quality, and compliance frameworks, with experience implementing standards through cataloging and lineage tools.
  • Experience deploying and operating cloud infrastructure, with the ability to collaborate effectively with Cloud and Infrastructure Architects to align design, security, and performance requirements.
  • Strong written, verbal, and interpersonal communication skills.
  • Highly adaptable, agile, and resilient in managing multiple concurrent projects; proactively seeks feedback to ensure alignment and continuous improvement.

Nice To Haves

  • Certifications such as Google Cloud Professional Data Engineer or Professional Cloud Architect, Databricks Certified Data Engineer Professional, dbt, Looker, Terraform, FinOps, or Cloud Security.

Responsibilities

  • Define the modern cloud-based data architecture (on Databricks Lakehouse and GCP/Azure) with cross-functional teams, owning key architectural decisions and validating them in production.
  • Architect and operationalize Medallion (Bronze–Silver–Gold) data models, ensuring governance, data quality, and reusability across analytical and ML workloads.
  • Combine data engineering and architecture expertise to design, develop, and deploy modern data warehouse and Lakehouse solutions, driving initiatives in data preparation, integration, exploration, and modeling.
  • Validate architectural decisions by personally implementing selected foundational components (e.g., ingestion frameworks, Medallion layer contracts, CI/CD templates).
  • Participate in Architecture Review Boards, interface with other Cloud Architects, and act as the technical liaison for the department.
  • Design, develop, and deliver complete data solutions that combine strong engineering execution with architectural design, managing pipelines, models, and workflows from ingestion through presentation while ensuring performance and cost efficiency.
  • Establish best practices and consistency across various Data on Cloud solutions.
  • Design and build highly reusable and scalable cloud-native data ecosystems.
  • Assist in developing comprehensive, strategic business cases used at the management and executive levels for funding and scoping decisions on data solutions.
  • Understand data technology trends and the practical application of existing, new, and emerging technologies to enable new and evolving business needs.
  • Partner closely with leadership and business stakeholders as a trusted and influential evangelist to identify important questions, define key metrics, and cultivate a data-driven, AI-ready culture.
  • Design and implement data security measures to protect sensitive information.
  • Work collaboratively with Data and Cloud Governance specialists to align on rules, processes, and standards, implementing strong governance through Unity Catalog, metadata management, and lineage tracking, while maintaining clear documentation of data architecture, flows, and dependencies.
  • Design the lifecycle management process to seamlessly handle Python package and API dependency version changes.

Benefits

  • Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
  • Flexible work options
  • Physical and mental well-being programs
  • Regularly scheduled virtual fitness classes
  • Mentorship programs and training and career development
  • Recognition programs and referral rewards
  • Hackathons