Principal Data Scientist

Questrade Financial Group
Toronto, ON
Hybrid

About The Position

Questrade Financial Group (QFG), through its companies - Questrade, Questbank, Questrade Wealth Management, Community Trust Company, Zolo, and Flexiti - provides securities and foreign currency investment, professionally managed investment portfolios, mortgages, real estate services, financial services, and more. We use cutting-edge technology to help Canadians become more financially successful and secure.

At QFG, we combine human-centric collaboration with AI-driven innovation to redefine financial services. The ideal candidate will be a catalyst for change, using AI to transform and deliver unparalleled customer experiences and shaping a future where AI empowers our teams to do their best work. Join our diverse, inclusive, and hybrid workplace to unleash your creativity and nurture your curiosity without limits. If you share this sense of infinite possibility, come shape your future at QFG.

The Alternative Investments Team is launching new crypto and alternative product capabilities. We're looking for a subject matter expert (SME) in Data and AI Engineering within the Growth Portfolio who can take us from 0→1 on the data value chain: implementing and maintaining flows into BigQuery, building compliance-grade reporting, establishing evergreen dashboards, and enabling self-serve analytics for the team.

We are not starting from scratch: Alternative Investing is already live with Precious Metals trading, and we have existing patterns for shipping data for analytics, creating views, and building dashboards and reporting pipelines. The expectation is that you build on those patterns and accelerate, then push us into the next maturity level. Once the foundation runs itself, you shift into higher-leverage work: predictive analytics and forecasting that sharpen business strategy, data-driven experimentation that makes our products better, and AI-powered automation that makes our operations faster.
You will resolve complex technical challenges within the existing data framework, and adapt and optimize AI tools to meet evolving compliance and reporting requirements for crypto products.

Requirements

  • Strong SQL + Python and a track record owning production data pipelines and analytics outputs end-to-end
  • Experience with modern cloud data analytics platforms (GCP/BigQuery, Databricks, Snowflake, etc.)
  • Strong data modeling instincts and the ability to create stable metric definitions
  • Experience building dashboards that people actually use and evolving them with the business
  • Production software engineering: you write code that ships, you collaborate on codebases via Git, and you're comfortable with CI/CD and container orchestration (Docker, Kubernetes/GKE)
  • Comfort operating in regulated environments with strong privacy/PII awareness and auditability discipline
  • Exceptional written and verbal communication skills; you can translate "data speak" into business decisions
  • High autonomy: you can prioritize, ship thin slices, and navigate enterprise constraints without stalling
  • AI-first mindset: you have concrete examples of using AI to accelerate your engineering/analytics work, automate repetitive tasks, and continuously improve your own workflow.

Nice To Haves

  • Product analytics instrumentation/experimentation (e.g., Amplitude)
  • Time-series forecasting or operational modelling
  • Experience with GCP data services beyond BigQuery (e.g., Dataflow, Composer/Airflow, Pub/Sub) or a similar suite in AWS/Azure
  • Success enabling self-serve analytics / data democratization in complex orgs
  • Enterprise or hobby projects building LLM applications with an agent framework (e.g., LangChain/LangGraph, Google's ADK, Open-Claw, etc.). Bonus points for explaining how your evals, traces, and LLMOps make it robust!

Responsibilities

  • Reporting & compliance foundation (near-term): Build and operate daily reconciliation and supervision/surveillance reporting flows for crypto, derivatives, and precious metals products and business lines; ensure outputs are reliable, traceable, and delivered via approved secure mechanisms.
  • Data integration & modeling: Ingest, model, and curate core datasets in BigQuery; define metric definitions with clear grains and ownership.
  • Dashboards & monitoring: Execute the full lifecycle from data pipeline to shipped dashboard. Whether it's a Looker explore for business users, a Grafana panel for real-time ops, or a custom PyShiny app for interactive decision-making, you pick the right tool, build it, deploy it, and keep it running. We care about how you think and that the result actually provides value.
  • Self-serve enablement: Build and maintain queryable tables/views, documentation, interfaces, and examples so internal users can answer questions, explore data, research ideas, and generate new insights. Rather than becoming the analyst bottleneck, you make data insights possible without one.
  • Production engineering & delivery: Write and ship production code. Collaborate on shared codebases with software engineers, deploy through CI/CD pipelines, and monitor services on scalable infrastructure (GKE/Kubernetes). You're comfortable with end-to-end delivery of simple full-stack solutions for internal users.
  • Stakeholder partnership & problem solving: Engage directly with compliance, ops, product, and business leaders with an owner's mindset to understand their needs, diagnose issues, and deliver solutions. You don't wait for a ticket; you solicit feedback, discover problems, and fix them.
  • Automation & AI that compounds: Automate recurring workflows, shrink manual operational load, and then turn that same instinct on your own process. Use AI to draft pipelines, generate tests, explore data faster, and rapidly prototype solutions that would have taken weeks in 2023. Your goal is to make the "must-have" work boring so you can spend your time on the work that actually moves the business.
  • Stack pragmatism: Our baseline today is BigQuery + Looker + Python microservices in GKE. We care about outcomes, not stale patterns. If you see a better way, you have the autonomy to shape direction, and the pragmatism to work with legacy systems when that's the fastest path.

Benefits

  • Health & wellbeing resources and programs
  • Paid vacation, personal, and sick days for work-life balance
  • Competitive compensation and benefits packages
  • Hybrid work environment with at least three days in office
  • Career growth and development opportunities
  • Opportunities to contribute to community causes
  • Work with diverse team members in an inclusive and collaborative environment


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 11-50 employees
