Python/AI Engineer

RBC
Toronto, ON
Onsite

About The Position

We are looking for a Data Engineering Tools Developer to build internal tools, frameworks, and developer utilities that improve the productivity and reliability of our Data Engineering teams. This role focuses on developing Python-based frameworks, Streamlit applications, and reusable engineering tools that enable faster development, monitoring, and governance of enterprise data pipelines. You will work closely with data engineers, platform engineers, and data architects to standardize how data pipelines are built, tested, deployed, and monitored across the organization. The ideal candidate combines strong Python development skills with experience in data engineering environments and enjoys building platforms and tools that empower other engineers.

Requirements

  • Bachelor’s degree in Computer Science, Software Engineering, or related technical field.
  • 4–7 years of experience in software engineering or data engineering roles.
  • Strong proficiency in Python development.
  • Experience building data pipelines and data processing frameworks.
  • Experience developing internal engineering tools or reusable libraries.
  • Experience with Streamlit or similar frameworks for building data tools and dashboards.
  • Strong knowledge of SQL and working with large datasets.
  • Experience with source control (Git), testing frameworks, and CI/CD pipelines.
  • Experience working in Agile engineering environments.

Nice To Haves

  • Experience with data orchestration tools such as Apache Airflow, Prefect, or Dagster.
  • Familiarity with modern data platforms such as Snowflake, Databricks, or cloud data lakes.
  • Experience with data transformation frameworks such as dbt.
  • Knowledge of data quality and data governance tooling.
  • Experience working in financial services or banking environments.
  • Familiarity with containerization technologies such as Docker.
  • Experience designing and developing AI engineering solutions using Cortex.

Responsibilities

  • Design and build internal tools and frameworks that accelerate data pipeline development and deployment.
  • Develop Python libraries and reusable components for data ingestion, transformation, testing, and monitoring.
  • Build interactive developer tools and dashboards using Streamlit to simplify pipeline monitoring, troubleshooting, and data exploration.
  • Create frameworks for data validation, schema enforcement, and automated pipeline testing.
  • Develop utilities that improve data pipeline observability, logging, and operational support.
  • Collaborate with data engineers to standardize pipeline development patterns and best practices.
  • Integrate tools with data orchestration platforms, CI/CD pipelines, and cloud data platforms.
  • Build automation for data platform operations, metadata management, and pipeline governance.
  • Support the adoption of data engineering best practices and engineering standards across the organization.
  • Troubleshoot and optimize tools to ensure performance, scalability, and reliability in production environments.

Benefits

  • A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable
  • Leaders who support your development through coaching and managing opportunities
  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • A world-class training program in financial services
  • Flexible work/life balance options
  • Opportunities to do challenging work