About The Position

  • Build, refactor, and support enterprise data pipelines for data collection, transformation, and delivery
  • Develop and maintain Snowflake-based data solutions using Snowpark, Python, and SQL
  • Migrate existing Azure Data Factory pipelines into Snowflake Snowpark solutions
  • Join and transform data from multiple source systems for reporting, dashboards, KPIs, and analytics use cases
  • Implement infrastructure that supports secure data storage, processing, and retrieval in Snowflake
  • Execute work from defined requirements, technical designs, and priorities set by team leads and architects
  • Identify delivery risks, technical issues, or blockers and escalate as needed
  • Manage assigned tasks and deliverables against project timelines and sprint commitments
  • Apply performance tuning and optimization across Python and SQL workflows
  • Use AI-assisted development tools to support coding, refactoring, debugging, and documentation while following engineering standards
  • Validate AI-generated output to ensure security, quality, performance, and governance requirements are met
  • Share AI tool usage patterns and best practices with the broader engineering team

Requirements

  • 3 to 5+ years of experience in Data Engineering or Data Integration roles
  • Strong hands-on experience with Snowflake in a production environment
  • Strong hands-on experience with Snowpark pipeline development
  • Senior-level Python skills for data engineering and integration workloads
  • Advanced SQL skills, including complex transformations and query tuning
  • Experience working with Azure Data Factory and translating pipeline logic into Python-based implementations
  • Experience migrating ETL or data pipelines across cloud platforms, especially from Azure to Snowflake
  • Experience working with REST APIs using Python
  • Experience in Agile/Scrum teams with sprint-based delivery
  • Understanding of data security, governance, and enterprise engineering standards
  • Bachelor’s degree or equivalent practical experience

Nice To Haves

  • Snowflake Tasks and Streams
  • Snowflake warehouse configuration and optimization
  • AWS experience
  • Azure experience
  • CI/CD for data engineering workflows
  • Experience with large-scale cloud data migration projects
