Principal Engineer I - Senior Data Engineer

Western Alliance Bank
Columbus, OH
Onsite

About The Position

As the Principal Data Engineer, you are accountable for designing, building, and implementing critical components of the enterprise data platform in accordance with the strategic data and analytics needs of Western Alliance Bank's Regulatory Reporting Program. This challenging role requires the best of your data management, cloud data engineering, and technology knowledge to develop a cross-business-unit data tier that is leveraged to implement and enhance the Bank's regulatory reporting capabilities, including Large Financial Institution (LFI) requirements. The Principal Data Engineer is a pioneer in our modern data platform, working directly with data analysts, data engineers, enterprise architects, and business stakeholders. Embedded in the Enterprise Data & Analytics function, you influence the culture of data integration practices and are essential to mentoring other data professionals.

Western Alliance Bancorporation is one of the country's top-performing banking companies, with more than $80 billion in assets. Through its primary subsidiary, Western Alliance Bank, clients benefit from a full spectrum of tailored commercial banking solutions and consumer products, all delivered with outstanding service by industry experts who put customers first.

Requirements

  • 8+ years' experience in data engineering, specifically Extract, Transform, and Load (ETL) concepts and processes, enterprise data warehouse capabilities, database principles, and related tools and technologies; strong-to-expert proficiency in Azure Data Factory, Azure Synapse Analytics, and/or Databricks preferred.
  • 5+ years’ experience designing, implementing, and supporting cloud data solutions; Azure Data Lake, Azure Data Factory, Azure Data Services, Azure Synapse, Azure Logic Apps, and Azure DevOps experience strongly preferred.
  • Bachelor’s degree in engineering or related field.
  • Expert-level experience with at least one RDBMS and a query language such as T-SQL, PL/SQL, or Spark SQL.
  • Ability to design and build reusable, dynamic PySpark components, including companion notebook frameworks, shared helper functions and libraries, and parameter-driven ingestion and transformation patterns.
  • Ability to support and influence the migration from Azure Synapse to Microsoft Fabric, including modernization of patterns and frameworks.
  • Expert-level experience in conceptual, logical, and physical data design.
  • Certifications within Azure such as Azure Fundamentals, Azure Fabric Data Engineer, Azure Data Scientist, and/or Azure DevOps Engineer.
  • Experience with design tools for creating conceptual architecture diagrams and data flow diagrams, such as Visio, ArchiMate, or Lucidchart.
  • Excellent communication skills, both verbal and written.

Nice To Haves

  • Familiarity with data science and analytics tools such as SAS, Tableau, and Power BI.
  • Experience in Agile, SAFe, and/or Scrum is preferred.
  • Experience integrating with data quality, data catalog, and data lineage tools is preferred.
  • Additional cloud data certifications from any major cloud provider (Azure, AWS, GCP) are preferred.
  • Banking or financial services industry experience, or experience in another highly regulated industry, is preferred.
  • Regulatory Reporting experience, including Large Financial Institution requirements, is preferred.

Responsibilities

  • Design, build, and implement critical components of the enterprise data platform in accordance with the strategic data and analytics needs of Western Alliance Bank Regulatory Reporting Program.
  • Develop a cross-business unit data tier that is leveraged to implement and enhance the Bank’s regulatory reporting capabilities, including Large Financial Institution requirements.
  • Implement data engineering solutions that cover the full data lifecycle, maintain sound technical hygiene, and use Azure DevOps to execute the Western Alliance Bank Regulatory Reporting and LFI strategy and roadmap.
  • Develop ETL and data pipeline capabilities considering how data is created, transformed, stored, archived, analyzed, and shared across Western Alliance Bank and our partner systems.
  • Deliver expert-level PySpark development, including writing dynamic, parameterized, and reusable code; building shared frameworks, helper functions, and companion notebook structures; and performance tuning and optimizing Spark workloads.
  • Implement Azure DevOps CI/CD pipelines for all data solutions in adherence to Western Alliance Bank technical standards.
  • Apply Test Driven Development methodology to all data solutions designed, built, and implemented.
  • Lead the implementation of outcomes, recommendations, and designs from Data Governance and Enterprise Architecture.
  • Perform data exploration and data profiling with tools such as SSAS.
  • Actively participate in making data architecture decisions.
  • Provide technical mentorship to team members and other data professionals.
  • Collaborate with product owners and business stakeholders to gain a working understanding of business requirements and operational processes.
  • Implement built-in quality and compliance-by-design in all data solutions.
  • Implement methods to include unstructured data and big data.

Benefits

  • Competitive salaries
  • An ownership stake in the company
  • Medical and dental insurance
  • Time off
  • A great 401k matching program
  • Tuition assistance program
  • An employee volunteer program
  • A wellness program
  • Opportunity to bolster your business knowledge, learning the ins and outs of how successful companies operate and manage their finances and gaining invaluable hands-on experience to help grow your career.