Data Engineer

Utah Community Credit Union (UCCU), Provo, UT

About The Position

The Data Engineer will own the design, development, and optimization of the data infrastructure that underpins analytics, reporting, and machine learning initiatives across the credit union. Working closely with business intelligence analysts, compliance teams, and technology leadership, this role requires both technical depth and the ability to translate complex data challenges into reliable, scalable solutions within a regulated financial environment.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field (or equivalent professional experience)
  • 3–5 years of hands-on experience in data engineering, data infrastructure, or a closely related role
  • Advanced proficiency in SQL and experience with large-scale relational and columnar databases (e.g., PostgreSQL, Redshift, Snowflake, BigQuery)
  • Strong Python skills for data pipeline development and automation
  • Demonstrated experience designing and maintaining ETL/ELT workflows using tools such as dbt or Matillion
  • Working knowledge of at least one major cloud data platform (Snowflake, AWS, Azure, or GCP) and its associated data services
  • Solid understanding of data warehousing concepts including dimensional modeling and schema design
  • Experience implementing data quality monitoring, lineage tracking, and observability practices
  • Strong communication skills with the ability to work effectively across technical and non-technical audiences

Nice To Haves

  • Experience in a regulated financial services environment such as credit unions, banks, or fintech companies
  • Familiarity with credit union core banking platforms such as Symitar (Episys), Jack Henry, or Fiserv
  • Knowledge of data governance frameworks or master data management
  • Exposure to machine learning pipelines and feature engineering for predictive models
  • Experience with BI platforms such as Power BI, Tableau, or Looker
  • Relevant certifications such as AWS Certified Data Analytics, Snowflake SnowPro, or dbt Certified Developer
  • Familiarity with NCUA examination processes or financial regulatory reporting (e.g., HMDA, Call Report data)

Responsibilities

  • Architect, build, and maintain robust ETL/ELT pipelines to integrate data from core banking systems, lending platforms, digital banking channels, and third-party vendors
  • Design and manage the credit union’s data warehouse and data lake, including schema design, partitioning strategies, and performance optimization
  • Develop and enforce data quality frameworks, including automated testing, validation rules, and alerting for pipeline anomalies
  • Lead the implementation of data modeling best practices (dimensional modeling, data vault, or similar) to support scalable analytics
  • Collaborate with data analysts, compliance officers, and business stakeholders to define data requirements and deliver trusted data products
  • Manage orchestration and scheduling of data workflows using tools such as Matillion, Snowflake, or equivalent
  • Evaluate, implement, and maintain cloud data infrastructure on Snowflake or Azure, including compute, storage, and networking resources
  • Ensure all data processes comply with applicable regulations, including NCUA guidelines, BSA/AML requirements, and GLBA data privacy standards
  • Partner with the IT security team to enforce data access controls, encryption standards, and audit logging
  • Mentor junior data engineers and analysts, providing technical guidance and code reviews
  • Drive adoption of DataOps practices, including CI/CD for data pipelines, version control, and documentation standards
  • Support data migration efforts during platform transitions, core system upgrades, or mergers and acquisitions
  • Work a regular and predictable schedule