About The Position

Supporting CDMARS (Commercial Delivery Model and Advisor Strategy) Data Governance & Data Strategy, we are seeking an experienced Data Engineer to join our team and lead the implementation, migration and optimization of our Commercial Data Analytics Cloud Data Lakehouse built on Snowflake. You'll be instrumental in architecting a modern data platform that unifies structured and semi-structured data storage with advanced analytics capabilities. This role offers the opportunity to work with cutting-edge Lakehouse technologies and establish the foundation for enterprise-wide data initiatives, enabling self-service analytics and supporting business intelligence across the organization.

Requirements

  • 4+ years of experience in data engineering with at least 2 years of hands-on Snowflake experience (including architecture, performance tuning, security features, and cost optimization strategies)
  • Advanced SQL skills with experience in complex query optimization and Snowflake-specific SQL features
  • Cloud data integration experience using tools like Fivetran, Matillion, Talend, or custom solutions for data ingestion, plus cloud platform knowledge (particularly Azure Blob Storage, AWS S3, or GCP Cloud Storage) for external data staging
  • Data modeling expertise with dimensional modeling, data vault, or other modern data architecture patterns
  • Scripting and automation proficiency in Python, JavaScript (for Snowflake stored procedures), and shell scripting
  • Version control and deployment experience with Git, CI/CD pipelines, and infrastructure as code practices

Nice To Haves

  • Data orchestration tools experience with Airflow, Prefect, or dbt for workflow management and data transformation
  • Data lake technologies experience with Delta Lake, Databricks, or other lakehouse platforms
  • Streaming data integration using Snowflake Streams, Kafka connectors, or real-time data ingestion patterns
  • Modern data stack experience with tools like dbt, Looker, Tableau, or other analytics platforms integrated with Snowflake
  • Multi-cloud experience with data integration across AWS, Azure, and GCP environments
  • Snowflake certifications (SnowPro Core, SnowPro Advanced: Data Engineer, or SnowPro Advanced: Architect)

Responsibilities

  • Architect and implement scalable data lakehouse solutions using Snowflake, establishing data zones (raw, curated, analytics) and optimizing storage and compute performance; develop robust ELT/ETL pipelines with Snowflake native features (Snowpipe, Tasks, Streams) and external orchestration tools to process batch and streaming data from various sources
  • Design and implement dimensional models, data vault architectures, and modern data modeling patterns optimized for Snowflake's unique architecture and performance characteristics
  • Lead data migration efforts from legacy systems to Snowflake; integrate data from various sources including databases, cloud storage, SaaS applications, and real-time streams; champion the Don't Repeat Yourself (DRY) principle, using scalable strategies to build lean, reusable processes
  • Implement data governance frameworks, access controls, data lineage tracking, and ensure compliance with data privacy regulations within the Snowflake environment; monitor and optimize Snowflake warehouse performance by implementing clustering strategies, managing data lifecycle policies, and optimizing costs through efficient resource utilization
  • Contribute to the overall effectiveness of Commercial Banking Data Insights & Analytics
  • Work closely with data analysts, business intelligence developers, and data scientists to enable self-service analytics and provide technical guidance on best practices; liaise with partner groups and help coordinate partnerships with DA (Distribution Analytics), Borealis, PMR (Performance Management Reporting), and other analytics groups
  • Facilitate the handover of data to analytics and insights teams
  • Lead efforts to identify new technologies, execute on opportunities to improve productivity, and develop data and governance roadmaps
  • Work as an integral part of the DI&A team, leading and mentoring team members on new value-added approaches and fostering a culture of innovation and collaboration

Benefits

  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • Opportunities to do challenging work