Senior Data Engineer

Questrade Financial Group, Toronto, ON
Hybrid

About The Position

Questrade Financial Group (QFG) is seeking a Senior Data Engineer to build and scale the Flexiti Self-Serve Data Platform. This role is central to implementing Data Fabric and Data Mesh architectures, ensuring data is accessible, secure, and ready for advanced Agentic AI applications. The Senior Data Engineer will translate raw data into business value by developing high-performance Data Products that accelerate decision-making and revenue generation. The position involves designing and implementing technical capabilities for a decentralized engineering model, leading the transition to AI-augmented workflows that use Generative AI for code generation and testing, and embedding within business units as a "Forward-deployed Engineer" to solve data challenges at the source.

Requirements

  • Extensive experience scaling data solutions on Hyperscalers, with a strong preference for GCP.
  • Deep proficiency in GCP services such as BigQuery and Vertex AI, along with Spark, Databricks, and Data Mesh architectures.
  • Expert-level proficiency in Python, PySpark, and SQL.
  • Strong knowledge of CI/CD, software supply chains, and MLOps principles.
  • Familiarity with visualization tools such as Power BI, Sigma, or Looker.
  • Experience with Terraform or CloudFormation.
  • Strong problem-solving abilities and attention to detail.
  • Excellent communication skills to translate technical concepts for non-technical audiences.
  • Ability to work independently and as part of a distributed team.
  • Strong documentation skills for architectural designs and technical workflows.
  • Self-motivated and driven.
  • Proven background in Financial Services or other highly regulated industries.
  • Experience participating in large-scale migrations from legacy systems to modern Cloud environments.
  • Bachelor’s or Master’s degree in Computer Science, Mathematics, Statistics, or a related field.

Responsibilities

  • Build and maintain scalable data pipelines using Spark and Databricks to unify structured and unstructured data across AWS/GCP and other cloud-native data platforms.
  • Develop the "Data Producer" and "Data Exchange" interfaces for autonomous data sharing and consumption by business units.
  • Implement platform guardrails and deployment patterns using Policy as Code and IaC Factory patterns.
  • Write highly efficient code and monitor compute patterns to maximize ROI and minimize cloud consumption costs (FinOps Optimization).
  • Build the underlying data infrastructure required to operationalize autonomous data agents and next-generation analytics (Agentic AI Support).
  • Lead by example in adopting prompt engineering and GenAI tools to automate documentation, testing, and code refactoring (AI-Augmented Engineering).
  • Implement AI/ML-based monitoring tools for proactive anomaly detection and self-correcting data quality workflows.
  • Ensure every pipeline meets rigorous financial regulatory standards by integrating security and compliance features directly into the build process (Built-in Security).
  • Contribute to the design and maintain the Enterprise Data Catalog & Data Exchange for data discoverability and documentation across all business lines (Metadata Management).
  • Partner with Product Owners and business stakeholders to translate complex engineering requirements into actionable business outcomes (Consultative Leadership).

Benefits

  • Health & wellbeing resources and programs
  • Paid vacation, personal, and sick days
  • Competitive compensation and benefits packages
  • Hybrid environment with at least 3 days in office
  • Career growth and development opportunities
  • Opportunities to contribute to community causes