Senior Data Engineer (AWS, FiveTran, dbt, Snowflake)

BOK Financial - Richardson, TX
Onsite

About The Position

The Data Solutions Engineer IV works in close collaboration with business and technology partners to design, develop, implement, and support data solutions across all lines of business at BOK Financial. The position requires close interaction, influence, and collaboration with other engineers, architects, analytics partners, vendor partners, and functional leaders. The Data Solutions Engineer IV defines enterprise-wide best practices, methodologies, governance, and standards, and serves as a data solutions subject matter expert. This position mentors and coaches lower-level Data Solutions Engineers and leads meetings as needed.

Requirements

  • Bachelor’s Degree in a data-centric field (Computer Science, Economics, Information Systems, Data Analytics, etc.) and 7+ years’ experience with demonstrated track record of successful technical leadership in the execution of large-scale data projects or equivalent combination of education and experience.
  • Proven experience successfully building, automating, and supporting solutions in large-scale production data ecosystems, whether on-premises or in Azure/AWS cloud services.
  • Proven experience in successfully building highly complex and scalable data pipelines.
  • Strong understanding of BOK Financial business information systems and a strong grasp of Enterprise Data Warehouse concepts.
  • Strong hands-on experience independently designing, developing, and testing business intelligence and analytics solutions using proven or emerging technologies in a variety of environments.
  • Excellent oral and written communication skills to effectively represent oneself and BOK Financial, with the ability to present complex information and issues clearly and concisely.
  • Expert conceptual thinking and analytical skills, with the ability to analyze complex problems, including their interrelationships and dependencies, to identify common themes and solutions.
  • Ability to understand a goal and build out a work plan to accomplish it, adjusting to accommodate other priorities as needed.
  • Confidently represent Enterprise Data Solutions team in various meetings where business needs and use cases are presented. Provide guidance/next steps with little oversight that are aligned with established best practices, patterns, and architectural principles.
  • Strong sense of accountability, taking ownership over projects and responsibilities, and resolving issues proactively.

Nice To Haves

  • AWS
  • FiveTran
  • dbt
  • Snowflake

Responsibilities

  • Lead the design, implementation, and governance of enterprise ETL / ELT pipelines, leveraging Fivetran for managed ingestion and CDC, Snowflake as the central cloud data platform, and dbt for transformation, modeling, and analytics engineering.
  • Architect and review end-to-end data flows from source systems to curated, consumption-ready datasets, including:
      ◦ Source-aligned raw ingestion
      ◦ Conformed and reusable transformation layers
      ◦ Business-ready data products optimized for analytics and reporting
  • Establish and enforce dbt engineering standards, including:
      ◦ Development of modular dbt models (staging, intermediate, mart layers)
      ◦ Creation and reuse of dbt macros for standardization, automation, and consistency
      ◦ Implementation of dbt tests, documentation, and lineage to ensure data quality and transparency
      ◦ Version-controlled dbt projects aligned to enterprise release processes
  • Design and optimize Snowflake data architecture, including:
      ◦ Warehouse sizing and workload isolation strategies
      ◦ Partitioning, clustering, and performance tuning
      ◦ Secure data sharing and access control models
      ◦ Cost optimization through usage patterns and resource governance
  • Define and drive Python, PySpark, and SQL standards for data engineering workloads, supporting advanced transformations, large‑scale processing, and streaming or hybrid use cases where appropriate.
  • Implement Infrastructure as Code (IaC) using Terraform to provision and manage Snowflake objects, cloud infrastructure, and supporting platform components in a repeatable and auditable manner.
  • Design and integrate CI/CD pipelines for data engineering assets, including:
      ◦ Automated deployment of dbt models, macros, and tests
      ◦ Environment promotion (dev → test → prod)
      ◦ Code quality checks, linting, and automated validation
      ◦ Controlled and traceable releases aligned with enterprise SDLC practices
  • Oversee and validate Fivetran connector configurations, schema evolution handling, and ingestion SLAs to ensure reliability and trust in source‑to‑target pipelines.
  • Evaluate, approve, and govern open‑source and vendor data engineering tools (Fivetran, dbt, Snowflake, Kafka ecosystem, AWS services) with a focus on scalability, security, maintainability, and cost efficiency.
  • Lead proofs of concept and technical evaluations for new data engineering technologies and patterns, ensuring alignment with Snowflake‑centric, SQL‑first and automation‑driven architecture principles.
  • Establish best practices for pipeline observability, data quality, and operational monitoring, ensuring pipelines are robust, traceable, and production‑ready.
  • Partner with platform, security, and compliance teams to ensure data pipelines, infrastructure, and deployments meet enterprise security, access control, and regulatory requirements without compromising developer productivity.
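To make the layered modeling responsibility above concrete: the raw → staging → mart progression can be sketched in plain Python over hypothetical sample records. In an actual dbt project these steps would be SQL models, and all table and column names here are illustrative, not BOK Financial's.

```python
# Conceptual sketch of layered data modeling (raw -> staging -> mart).
# Sample data and names are hypothetical; dbt would express each layer
# as a SQL model rather than a Python function.

RAW_ORDERS = [  # source-aligned raw ingestion, as landed by an ingestion tool
    {"ORDER_ID": "1", "AMT": "100.50", "STATUS": " shipped "},
    {"ORDER_ID": "2", "AMT": "20.00", "STATUS": "CANCELLED"},
    {"ORDER_ID": "3", "AMT": "75.25", "STATUS": "shipped"},
]

def stg_orders(raw):
    """Staging layer: rename, cast, and lightly clean source columns."""
    return [
        {
            "order_id": int(r["ORDER_ID"]),
            "amount": float(r["AMT"]),
            "status": r["STATUS"].strip().lower(),
        }
        for r in raw
    ]

def mart_revenue(stg):
    """Mart layer: business-ready aggregate for analytics and reporting."""
    shipped = [r for r in stg if r["status"] == "shipped"]
    return {
        "shipped_orders": len(shipped),
        "revenue": sum(r["amount"] for r in shipped),
    }

if __name__ == "__main__":
    print(mart_revenue(stg_orders(RAW_ORDERS)))
    # {'shipped_orders': 2, 'revenue': 175.75}
```

The point of the layering is that cleanup happens once in staging, so every downstream mart consumes consistent types and values.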
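The dbt tests and data-quality checks mentioned in the responsibilities reduce to column-level assertions such as `not_null` and `unique`. A minimal Python sketch of those two generic checks follows; the sample rows are invented, and a real dbt project would declare these tests in YAML rather than code.

```python
# Minimal sketch of dbt-style generic data tests (not_null, unique).
# Each check returns its failures, mirroring dbt's convention that a
# test passes when it returns zero rows. Sample data is hypothetical.

def not_null(rows, column):
    """Return rows where the column is missing (the failures)."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values that appear more than once in the column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 2, "email": "c@example.com"},
]

if __name__ == "__main__":
    print(not_null(rows, "email"))      # one failing row with a null email
    print(unique(rows, "customer_id"))  # [2]
```

Wiring checks like these into CI, so a failing test blocks promotion from dev to prod, is what makes pipelines "traceable and production-ready" in practice.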

Benefits

  • Discretionary Bonus