Sr. Snowflake Data Engineer, Onsite, AVP

State Street
Princeton, NJ
Hybrid

About The Position

We are looking for a Sr. Snowflake Data Engineer with strong experience in design, data modeling, development, and performance tuning. The candidate will be responsible for building cloud-based data warehouse solutions for F2B clients. This role involves designing, building, and testing end-to-end data pipelines, including data ingestion (streaming, events, and batch), data integration, and data curation. The engineer will design, develop, and deploy scalable data pipelines and ETL processes on cloud-based infrastructure using Azure, Snowflake, DBT, Airflow, and Cosmos DB.

Key responsibilities include automating jobs and testing, optimizing data pipelines for various workloads and use cases, and supporting mission-critical applications with near-real-time data needs. The role also requires addressing data and environment issues, performing impact and root cause analysis, and implementing corrective, adaptive, and perfective maintenance. Additionally, the engineer will implement data models, transformations, and schema designs to support analytical and reporting needs, and optimize Snowflake performance through query optimization, resource management, and scaling strategies.

Requirements

  • 7+ years of experience in IT
  • Strong experience in design, data modeling, development, and performance tuning
  • Experience with microservices architecture and a solid understanding of cloud computing is highly desirable
  • Experience with microservices, including API- and event-driven architecture and development
  • Strong hands-on experience in troubleshooting DevOps pipelines and Azure services
  • Experience with DBT (Data Build Tool) for data modeling and transformation, and Apache Airflow for workflow orchestration and scheduling
  • Expertise in advanced Snowflake concepts, including setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, with a clear understanding of when and how to apply each feature
  • Experience re-clustering Snowflake data, with a good understanding of micro-partitions in Snowflake
  • Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns
  • Experience in handling semi-structured data (JSON, XML) in Snowflake
  • Bachelor's degree in a computer- or IT-related subject
  • 7+ years of experience in professional database and data warehouse development
  • 7+ years of development experience writing SQL queries and stored procedures for relational databases such as Oracle or SQL Server
  • 5+ years of Snowflake development experience, including Snowpipe, data sharing, Tasks, Streams, UDFs, and stored procedures
  • 5+ years of experience on cloud-based development including Azure Services, Azure DevOps, Kubernetes, Docker
  • Ability to communicate effectively and professionally, both in writing and orally
  • Be a team player with a positive attitude, enthusiasm, initiative, and self-motivation
  • Ability to multi-task, meet aggressive timelines, and demonstrate a strong work ethic
  • Experience working in the financial industry
  • Experience with agile development methodology

Nice To Haves

  • Experience with Snowflake utilities, including SnowSQL, Snowpipe, and Snowsight, for handling streaming data is a plus
  • Development experience with Databricks or Scala is a plus
  • Exposure to financial services, ideally in front-office, middle-office, or fund management business areas
  • Experience working with users or clients from either a technical or business area
  • Desire to work in a client-facing environment

Responsibilities

  • Design, build, and test end-to-end data pipelines, including data ingestion (streaming, events, and batch), data integration, and data curation
  • Design, develop, and deploy scalable data pipelines and ETL processes on cloud-based infrastructure using Azure, Snowflake, DBT, Airflow, and Cosmos DB
  • Define and implement automation of jobs and testing
  • Optimize data pipelines to support various workloads and use cases
  • Support mission-critical applications and near-real-time data needs from the data platform
  • Address data and environment issues; perform impact and root cause analysis; implement corrective, adaptive, and perfective maintenance
  • Implement data models, transformations, and schema designs to support analytical and reporting needs
  • Optimize and tune Snowflake performance, including query optimization, resource management, and scaling strategies

Benefits

  • Retirement savings plan (401K) with company match
  • Insurance coverage including basic life, medical, dental, vision, long-term disability, and other optional additional coverages
  • Paid time off including vacation, sick leave, short-term disability, and family care responsibilities
  • Access to our Employee Assistance Program
  • Incentive compensation, including eligibility for annual performance-based awards
  • Eligibility for certain tax-advantaged savings plans
  • Flexible work-life support
  • Paid volunteer days