Data/Information Architect

Circle K Stores, Tempe, AZ

About The Position

We are seeking a highly skilled Data Architect with extensive experience in designing and implementing data solutions using Azure Databricks and related technologies. The ideal candidate will have a strong background in data engineering, architecture, and analytics solutions, with a focus on cloud-centric environments. This role requires a deep understanding of the data lifecycle, from ingestion to transformation and consumption, and the ability to deliver architectural initiatives that align with business strategies.

Requirements

  • 10+ years of experience in the architecture, design, and implementation of analytics solutions.
  • 12+ years in the Data Engineering domain.
  • Experience designing and implementing at least 2-3 end-to-end projects in Databricks.
  • 3+ years of experience in Databricks, including Delta Lake, dbConnect, and SQL Endpoint.
  • Strong coding skills in Python or Scala, preferably Python.
  • In-depth understanding of Spark Architecture and data modeling.
  • Experience with cloud services such as Azure, AWS, or GCP.
  • Strong SQL and Spark-SQL skills.
  • Experience in building and supporting mission-critical technology components.

Nice To Haves

  • Knowledge of REST APIs.
  • Understanding of cost distribution.
  • Experience with migration projects to build a unified data platform.
  • Familiarity with dbt.
  • Experience with DevSecOps, Docker, and Kubernetes.
  • Knowledge of data ingestion technologies like Azure Data Factory and SSIS.
  • Experience with visualization tools such as Tableau and Power BI.

Responsibilities

  • Design and implement end-to-end architecture for a unified data platform covering all aspects of the data lifecycle.
  • Manage and mentor architecture talent within the respective sector.
  • Deliver architecture initiatives that demonstrate clear business efficiency.
  • Create architecture roadmaps and develop delivery blueprints for technology design.
  • Develop applications using Databricks and other cloud technologies.
  • Ensure compliance with data governance and security standards.
  • Optimize data pipelines for performance and cost efficiency.
  • Coordinate complex system dependencies and interactions.
  • Set best practices around Databricks CI/CD.