Senior Data Architect

Aureon
West Des Moines, IA
Onsite

About The Position

We are seeking a Senior Data Architect to define and own the enterprise data architecture across analytical, operational, and integration platforms. This role involves designing logical, physical, and conceptual data models to support a variety of use cases, including analytics, reporting, AI/ML, and operational needs. The Senior Data Architect will establish and enforce data architecture standards, patterns, and best practices, and design scalable cloud-based and hybrid data platforms such as data lakes, lakehouses, and warehouses.

A key aspect of this role is partnering with business, analytics, security, and application teams to translate requirements into data solutions while ensuring alignment with security, governance, privacy, and compliance requirements. The position also entails leading architecture reviews, providing technical guidance to data engineers and analytics teams, and evaluating and recommending data technologies, tools, and platforms. Additionally, the role will oversee data governance, metadata management, data quality, and lineage initiatives.

On the data engineering side (approximately 30% of the role), responsibilities include designing and building robust batch and streaming data pipelines for ingestion, transformation, and delivery; developing and optimizing ETL/ELT processes using modern data engineering frameworks; and collaborating with engineers to implement architectural patterns in production systems. Ensuring pipelines are reliable, scalable, performant, and cost-efficient, troubleshooting data issues, and optimizing queries, storage, and processing are also critical. Support for CI/CD practices, automated testing, and monitoring of data workflows is expected.

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent experience
  • 8+ years of experience in data architecture and/or data engineering roles
  • Proven experience designing enterprise-scale data platforms on Azure
  • Strong hands-on experience with: Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Data Factory
  • Advanced SQL skills
  • Strong knowledge of data modeling, performance tuning, and cost optimization in Azure
  • Experience integrating data from SaaS, on-prem, and cloud-based systems
  • Ability to clearly communicate architectural decisions to both technical and non-technical stakeholders
  • Background with data governance frameworks

Nice To Haves

  • Experience with Python
  • Experience with Azure Databricks
  • Experience designing data architectures that support AI and machine learning workloads, including feature engineering, training, inference, and monitoring at scale
  • Experience with Microsoft Fabric (OneLake, Lakehouse, Warehouse, Power BI integration)
  • Familiarity with Power BI semantic models and analytics consumption patterns
  • Knowledge of Microsoft Purview (formerly Azure Purview) for data governance and lineage
  • Experience with event-driven and streaming architectures in Azure
  • Understanding of DataOps / DevOps practices in an Azure environment
  • Experience supporting regulated or security-conscious enterprise environments

Responsibilities

  • Define and own enterprise data architecture across analytical, operational, and integration platforms
  • Design logical, physical, and conceptual data models to support analytics, reporting, AI/ML, and operational use cases
  • Establish and enforce data architecture standards, patterns, and best practices
  • Design scalable cloud-based and hybrid data platforms (e.g., data lakes, lakehouses, warehouses)
  • Partner with business, analytics, security, and application teams to translate requirements into data solutions
  • Ensure data solutions align with security, governance, privacy, and compliance requirements
  • Lead architecture reviews and provide technical guidance to data engineers and analytics teams
  • Evaluate and recommend data technologies, tools, and platforms
  • Oversee data governance, metadata management, data quality, and lineage initiatives
  • Design and build robust data pipelines (batch and streaming) for ingestion, transformation, and delivery
  • Develop and optimize ETL/ELT processes using modern data engineering frameworks
  • Collaborate with engineers to implement architectural patterns in production systems
  • Ensure pipelines are reliable, scalable, performant, and cost-efficient
  • Troubleshoot data issues and optimize queries, storage, and processing
  • Support CI/CD practices, automated testing, and monitoring for data workflows