About The Position

We are seeking a highly experienced Senior Data Engineer to support a large-scale Snowflake migration project for a regulated financial environment (OCC). This role requires a hands-on expert who can independently design, develop, and support modern data pipelines while translating legacy ETL systems into scalable, production-ready Snowflake solutions. The ideal candidate will bring deep expertise in Snowflake, strong SQL and pipeline engineering skills, and a proven track record of working in production environments with high accountability, auditability, and operational ownership.

Requirements

  • 7+ years of experience in data engineering / ETL development
  • Strong hands-on experience with Snowflake (data modeling, performance tuning, SnowSQL, Snowpipe, etc.)
  • Advanced proficiency in SQL and pipeline design
  • Proven experience migrating legacy ETL systems (SAS, Informatica, or similar) to modern cloud-based platforms
  • Experience building metadata-driven frameworks with logging, auditing, and control layers
  • Strong experience in production support environments, including monitoring, incident management, and recovery
  • Familiarity with regulated environments (financial services preferred) with strict data governance and compliance requirements
  • Experience with orchestration tools (e.g., Airflow, Azure Data Factory, etc.)
  • Strong problem-solving skills and ability to work independently at a senior level

Nice To Haves

  • Experience working on Snowflake migration projects in large enterprises
  • Background in SAS-based data environments
  • Knowledge of data governance, lineage, and compliance frameworks
  • Experience with cloud platforms (AWS, Azure, or GCP)
  • Familiarity with DevOps practices in data engineering

Responsibilities

  • Design, develop, and optimize end-to-end data pipelines using Snowflake and modern data engineering tools
  • Translate and modernize legacy ETL workflows (SAS or similar) into scalable Snowflake-based architectures
  • Build and maintain metadata-driven ETL frameworks with strong auditability, lineage, and control mechanisms
  • Write and optimize complex SQL queries for performance and scalability within Snowflake
  • Ensure data quality, integrity, and governance across all pipelines
  • Provide production support, including troubleshooting, root cause analysis, and issue resolution in a regulated environment
  • Collaborate with cross-functional teams including Data Architects, Analysts, and Business stakeholders
  • Implement best practices for CI/CD, version control, and automated testing in data pipelines
  • Participate in code reviews, design discussions, and continuous improvement initiatives