About The Position

Kunai builds full-stack technology solutions for banks, credit and payment networks, infrastructure providers, and their customers. Together, we are changing the world’s relationship with financial services. At Kunai, we help our clients modernize, capitalize on emerging trends, and evolve their business for the coming decades by remaining tech-agnostic and human-centered. The Corporate Workspace Technology area is currently hiring a Senior Data Engineer for its tooling-enablement modernization effort. As a Senior Data Engineer, you will be responsible for the design, modeling, development, and management of data warehouse objects in the Snowflake data store, using data pipelines built with tools such as Talend, Informatica, and APIs. The ideal candidate has the skills listed below and, in addition, is a self-driven, dedicated individual who works well in a team and thinks and acts strategically.

Nice To Haves

  • Experience with Power BI or a comparable reporting/BI tool

Responsibilities

  • Design, implement, and manage scalable data solutions in the Snowflake environment for optimized data storage and processing.
  • Migrate existing data domains and flows from relational data stores to the cloud data store (Snowflake).
  • Identify and optimize new and existing data workflows.
  • Identify and implement data integrity practices.
  • Integrate data governance and data science tools with the Snowflake ecosystem in line with established practice.
  • Support the development of data models and ETL processes to ensure high-quality data ingestion into the cloud data store.
  • Collaborate with team members to design and implement effective data workflows and transformations.
  • Assist in the maintenance and optimization of Snowflake environments to improve performance and reduce costs.
  • Contribute to proofs of concept, documentation, and best practices for data management and governance within the Snowflake ecosystem.
  • Participate in code reviews and provide constructive feedback to improve the quality of team deliverables.
  • Design and develop data ingestion pipelines using Talend/Informatica, following industry best practices.
  • Write efficient SQL and Python scripts for large-dataset analysis and build end-to-end automation processes on a set schedule.
  • Design and implement a data distribution layer using the Snowflake REST API.

Benefits

  • Competitive compensation
  • Professional development opportunities
  • Flexible work arrangements