AI Data Engineer

BMO · Toronto, ON
Hybrid

About The Position

We accelerate BMO’s AI journey by building enterprise-grade, cloud-native AI solutions. Our team combines engineering excellence with cutting-edge AI to deliver scalable, secure, and responsible solutions that power business innovation across the bank, enabling partner teams across the enterprise to unlock value at scale. We are engineers, AI practitioners, platform builders, thought leaders, multipliers, and coders. Above all, we are a global team of diverse individuals who enjoy working together to create smart, secure, and scalable solutions that make an impact across the enterprise. Our ambition is bold: deploy our capital and resources to their highest and most profitable use through a digital-first operating model, powered by data and AI-driven decisions.

Requirements

  • 5–7 years of AI software engineering experience, with 3+ recent years in AI/ML engineering, AI agent development, and multi-agent systems.
  • Hands-on experience across Microsoft Azure services (designing, deploying, and operating cloud-native systems).
  • Azure AI Engineer and Python certifications are a plus.
  • Strong background in AI agent ecosystems.
  • Experience designing and maintaining CI/CD pipelines using GitHub Actions and CDK for Terraform.
  • Demonstrated ability to implement monitoring/observability for AI/agent solutions (logging, tracing, metrics, and operational alerting).
  • Proven delivery on multiple AI initiatives—comfortable shaping ambiguity into “the right questions,” crisp requirements, and practical design.

Nice To Haves

  • Experience with Azure AI Foundry / Microsoft “Foundry” tooling in AI solution enablement and governance/tuning workflows.
  • Familiarity with agent taxonomy/labeling approaches and how to apply them to scale standardized development across teams.
  • Background in designing enterprise-grade platform layers (identity, access controls, registry/source-of-truth patterns) for agents.
  • Knowledge of Financial Services industry.

Responsibilities

  • Design and implement reliable, scalable ingestion and integration pipelines for structured, semi-structured, unstructured, and multi-modal data (e.g., databases, files, documents, APIs, events), ensuring data is AI-ready, governed, secure, and observable.
  • Apply data quality, validation, monitoring, and testing frameworks in production pipelines.
  • Ensure pipelines follow enterprise governance, access control, and security standards, including role-based access and lineage considerations.
  • Monitor pipeline performance, troubleshoot failures, and optimize cost and throughput.
  • Integrate AI services (e.g., document understanding, content understanding, embeddings, search, LLM APIs) into production data workflows.
  • Build and maintain ETL/ELT pipelines using cloud‑native services and distributed processing frameworks.
  • Develop production‑grade services using Python and REST APIs to expose data and AI capabilities.
  • Partner with leadership to clarify expected outcomes/vision and translate them into an executable build plan, architecture decisions, and delivery milestones.
  • Develop feature engineering pipelines to support ML and GenAI use cases, including retrieval‑augmented generation (RAG).
  • Own the development of AI data engineering standards, best practices, and reusable frameworks, driving consistency and quality across teams and platforms.
  • Lead collaboration with cross‑functional teams to ensure clear, consistent definition and alignment of data input and output requirements.

Benefits

  • Health insurance
  • Tuition reimbursement
  • Accident and life insurance
  • Retirement savings plans