Data Engineer II

Thread Bank, Nashville, TN

About The Position

Who We Are

Thread Bank is a digital-first financial technology community bank that aims to enhance customer engagement through innovative solutions. Thread Bank offers a modern website, a CRM system, and a mobile app to simplify banking for businesses and individuals. Our embedded banking solution helps business technology platforms provide secure banking experiences. We also partner with other banks, credit unions, and FinTechs to integrate compliant financial solutions. Thread Bank values innovation, collaboration, and flexibility, offering excellent benefits and a family-friendly culture.

What We Are Looking For

We are looking for people who thrive in a fast-paced growth environment while remaining within regulatory boundaries. Thread Bank provides a unique opportunity to be part of a high-growth, cutting-edge fintech startup within the stable and profitable banking industry. This is an excellent opportunity for a professional looking to advance their career as the company grows.
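As a rough sketch of the Snowpipe/COPY INTO ingestion work this role involves, a pipe loading Parquet files from an S3 stage might look like the following. All object names (stage, pipe, table, storage integration) are hypothetical placeholders, not Thread Bank's actual configuration.

```sql
-- Hypothetical external stage over an S3 landing bucket
CREATE STAGE IF NOT EXISTS raw.s3_landing
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = PARQUET);

-- Auto-ingest pipe: S3 event notifications (via the Snowflake-managed
-- SQS queue) trigger the COPY whenever new files land
CREATE PIPE IF NOT EXISTS raw.events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.events
  FROM @raw.s3_landing
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

`MATCH_BY_COLUMN_NAME` maps Parquet column names onto the target table, which tolerates benign column-order changes in the source files.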

Requirements

  • 3–5 years of professional data engineering experience, with at least 2 years in a production Snowflake environment.
  • Hands-on Snowflake production experience: Snowpipe, COPY INTO, MERGE INTO/upsert patterns with hash-based deduplication, Streams, Tasks, Dynamic Tables, RBAC, column-level security, and compute cost governance.
  • Experience designing vendor-agnostic canonical schemas across heterogeneous data sources, including multi-source entity resolution and schema evolution handling.
  • AWS data lake familiarity: S3 partitioning, SQS/SNS event notifications, cross-account access, IAM least-privilege design, and CloudWatch observability sufficient to debug Snowpipe ingestion issues.
  • Demonstrated ability to collaborate with infrastructure and data engineering peers across a small, high-velocity team; experience working alongside third-party implementation partners and capturing institutional knowledge.
  • Production-quality pipeline documentation skills: data lineage, transformation logic, and design decision logs.
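The hash-based MERGE/upsert pattern named in the requirements can be sketched as below. Table and column names are illustrative only; the deduplication window and conflict rule would depend on the actual source semantics.

```sql
MERGE INTO dw.customer AS t
USING (
    -- Deterministic hash key over the natural identifiers;
    -- QUALIFY keeps one row per key so the MERGE stays idempotent
    SELECT
        MD5(CONCAT_WS('|', source_system, source_customer_id)) AS hash_key,
        source_system,
        source_customer_id,
        customer_name,
        updated_at
    FROM stage.customer
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY source_system, source_customer_id
        ORDER BY updated_at DESC
    ) = 1
) AS s
ON t.hash_key = s.hash_key
WHEN MATCHED AND s.updated_at > t.updated_at THEN UPDATE SET
    customer_name = s.customer_name,
    updated_at    = s.updated_at
WHEN NOT MATCHED THEN INSERT
    (hash_key, source_system, source_customer_id, customer_name, updated_at)
VALUES
    (s.hash_key, s.source_system, s.source_customer_id, s.customer_name, s.updated_at);
```

The `updated_at` guard in the `WHEN MATCHED` clause makes re-running the statement against already-loaded data a no-op rather than a spurious update.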

Nice To Haves

  • dbt Core/Cloud or Snowflake-native transformation tooling
  • Snowflake Cortex AI
  • Python (data engineering context)
  • Event-driven data patterns (Kafka, Kinesis, or equivalent)
  • Financial services or fintech data environments (BaaS, embedded banking, payments, GLBA NPI/PII)
  • IaC familiarity (CloudFormation or Terraform)

Responsibilities

  • Design and implement ingestion, transformation, and data warehouse layers — including Snowpipe/COPY INTO configurations for Parquet files from S3 and idempotent MERGE patterns using deterministic hash keys across normalized entity schemas.
  • Build and extend the Raw → Stage → Data Warehouse transformation pipeline using Snowflake Streams, Tasks, and/or Dynamic Tables, and accommodate new data sources within the established architecture.
  • Collaborate on the design of Thread Bank's vendor-agnostic canonical data model, covering multi-source entity resolution, deduplication, and cross-source attribution across the embedded banking tech stack.
  • Implement and maintain column-level security, RBAC, and field-level encryption within Snowflake, including KMS-backed key management for sensitive data elements.
  • Monitor and optimize Snowflake compute consumption — warehouse sizing, clustering, result caching, and query profiling — to support cost governance objectives.
  • Partner with third-party implementation partners on the Snowflake migration, evaluating design decisions for extensibility, surfacing gaps early, and owning internal documentation and knowledge capture throughout the engagement.
  • Extend Snowflake Cortex AI agent capabilities to expand AI-assisted querying and reporting against Thread Bank's staged and curated data layers.
  • Produce and maintain data governance documentation — including data lineage, pipeline controls, and data element definitions — in collaboration with Information Security and Compliance stakeholders.
  • Other duties and responsibilities may be assigned, according to the needs of the Bank.
  • Apply strong critical thinking, analytical, and problem-solving skills.
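One minimal way to wire the Raw → Stage hop with Streams and Tasks, the mechanism named in the responsibilities above, is sketched here. The warehouse, table, and task names are hypothetical.

```sql
-- Change capture on the raw table
CREATE STREAM IF NOT EXISTS raw.customer_stream ON TABLE raw.customer;

-- Scheduled task that only runs when the stream has new rows,
-- so the warehouse does not spin up for empty intervals
CREATE TASK IF NOT EXISTS stage.load_customer
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw.customer_stream')
AS
  INSERT INTO stage.customer
  SELECT source_system, source_customer_id, customer_name, updated_at
  FROM raw.customer_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to activate
ALTER TASK stage.load_customer RESUME;
```

Consuming the stream inside a DML statement advances its offset automatically, so each batch of changes is processed exactly once; Dynamic Tables offer a more declarative alternative to this pattern.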