Senior Data Analytics Engineer

BMO
Toronto, ON

About The Position

The Senior Data Engineering & Analytics Specialist is responsible for designing, building, and optimizing scalable data platforms and analytical solutions that enable high-quality, data-driven decision-making. This role supports the full lifecycle of enterprise data assets—ranging from ingestion and transformation to modeling, analytics, and visualization—across hybrid cloud and on‑prem environments. The ideal candidate combines strong data engineering expertise, deep SQL Server knowledge, dashboarding and cloud experience, and practical analytics skills, enabling them to translate complex business requirements into reliable data pipelines, performant data models, and actionable insights.

Requirements

  • Strong programming skills in Python, SAS, and SQL.
  • Experience with Power BI, including DAX and M Code.
  • Proficiency with Microsoft 365 tools: Office, Power Automate, SharePoint, OneDrive.
  • Advanced SQL Server configuration for analytical workloads.
  • Experience designing high-performance data structures (partitioning, indexing, columnstore).
  • Strong understanding of backup, recovery, and disaster recovery strategies.
  • Hands-on experience building enterprise-grade ETL pipelines for large datasets.
  • Proven experience with dimensional modeling (star/snowflake, facts/dimensions, SCDs).
  • Strong understanding of data integration, data warehousing, and enterprise data management.
  • Ability to translate complex business requirements into scalable data solutions.
  • Strong experience with AWS, including Redshift, Glue, and exposure to MLOps concepts.
  • Familiarity with Apache Spark, Hadoop, and modern data lake architectures.
  • Experience preparing and modeling data for analytics, reporting, and data visualization.
  • Strong analytical and problem-solving skills, with a data-driven mindset.
  • Ability to derive insights from data and communicate findings effectively.
  • Typically 4–6+ years of relevant experience in data engineering, analytics, or related fields.
  • Post-secondary degree in a related discipline or an equivalent combination of education and experience.
  • Demonstrated ability to work independently on complex initiatives while collaborating effectively across teams.
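The dimensional-modeling requirement above (star/snowflake schemas, facts and dimensions) can be illustrated with a minimal sketch in Python, one of the languages the posting lists. The table contents and column names here are hypothetical examples, not taken from the posting.

```python
from collections import defaultdict

# Hypothetical dimension table: surrogate key -> descriptive attributes.
dim_product = {
    1: {"name": "Chequing", "category": "Deposits"},
    2: {"name": "Mortgage", "category": "Lending"},
}

# Hypothetical fact table: each row references the dimension by its key.
fact_transactions = [
    {"product_key": 1, "amount": 250.0},
    {"product_key": 2, "amount": 1200.0},
    {"product_key": 1, "amount": 75.0},
]

def total_by_category(facts, dim):
    """Join facts to the dimension and aggregate, as a star-schema query would."""
    totals = defaultdict(float)
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] += row["amount"]
    return dict(totals)

print(total_by_category(fact_transactions, dim_product))
# {'Deposits': 325.0, 'Lending': 1200.0}
```

In a warehouse the same join-and-aggregate shape would run in SQL over partitioned or columnstore tables; the sketch only shows the fact/dimension relationship itself.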

Nice To Haves

  • Additional programming languages: Java, JavaScript.
  • Experience with AI/ML workflows (e.g., AWS SageMaker).
  • Knowledge of data security, encryption, compression, and privacy best practices.
  • Familiarity with data governance frameworks, metadata management, and data quality controls.
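The data-quality-controls item above can be sketched as a simple rule-based check. The rules, field names, and sample records are hypothetical, chosen only to show the pattern.

```python
def check_record(record):
    """Return a list of rule violations for one record (empty list = clean)."""
    issues = []
    # Hypothetical rule: a non-empty account identifier is required.
    if not record.get("account_id"):
        issues.append("missing account_id")
    # Hypothetical rule: amounts must not be negative.
    if record.get("amount") is not None and record["amount"] < 0:
        issues.append("negative amount")
    return issues

batch = [
    {"account_id": "A1", "amount": 100.0},
    {"account_id": "", "amount": -5.0},
]
# Map record index -> violations, keeping only records that fail a rule.
report = {i: check_record(r) for i, r in enumerate(batch) if check_record(r)}
print(report)
# {1: ['missing account_id', 'negative amount']}
```

Real governance tooling adds metadata, thresholds, and alerting on top, but the per-record rule evaluation is the core of most data quality controls.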

Responsibilities

Data Engineering & Pipeline Development

  • Design, build, and maintain robust, scalable ETL/ELT pipelines to ingest, transform, and load data from diverse sources, including IBM Netezza and cloud-based platforms.
  • Create and maintain optimal data pipeline architectures that support large-scale analytical workloads and evolving business needs.
  • Implement incremental loads, change data capture (CDC), and data staging strategies to ensure process efficiency, reliability, and data integrity.
  • Identify and implement process improvements, including automation, performance optimization, and infrastructure redesign for scalability.
Database & Platform Engineering

  • Configure and optimize SQL Server environments for high-throughput analytical use cases, including parallel query execution and indexing strategies.
  • Design and implement partitioned tables, indexed views, and columnstore indexes to support large datasets and complex analytical queries.
  • Manage and support SQL Server recovery models (Simple, Full, Bulk-Logged), including backup/restore strategies, log management, and disaster recovery planning.
  • Support hybrid cloud and on‑prem data platforms, ensuring secure, efficient, and cost-effective data access.
Data Architecture & Modeling

  • Design and maintain star and snowflake schemas, fact/dimension models, and slowly changing dimensions (SCDs).
  • Translate business requirements into scalable, analytics-ready data models.
  • Apply strong understanding of RDBMS, NoSQL concepts, and data formats such as CSV, Parquet, and JSON.
  • Partner with data governance and data strategy teams to improve data quality, consistency, and usability.
Analytics, Reporting & Insights

  • Develop and maintain Power BI dashboards using DAX and M Code to deliver actionable insights into customer behavior, operational performance, and key business metrics.
  • Apply data analytics techniques to identify trends, anomalies, and opportunities for optimization.
  • Collaborate with stakeholders to understand analytical needs and support data-driven decision-making.
Collaboration & Stakeholder Support

  • Work closely with business partners, analysts, and technical teams to support data-related initiatives and resolve complex issues.
  • Communicate technical concepts clearly to both technical and non-technical audiences.
  • Exercise sound judgment to independently solve complex problems within established standards and governance frameworks.
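The slowly-changing-dimension work described in the responsibilities above can be sketched as a Type 2 update: expire the current row and append a new versioned row. The column names and example record are hypothetical, and in practice this logic runs as a SQL `MERGE` rather than in application code.

```python
from datetime import date

# Hypothetical SCD Type 2 dimension: one current row per business key,
# with validity dates tracking history.
dim_customer = [
    {"customer_id": "C001", "segment": "Retail",
     "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_segment, effective):
    """Type 2 change: close out the current row, append a new current version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["segment"] == new_segment:
                return  # attribute unchanged; nothing to version
            row["valid_to"] = effective
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "segment": new_segment,
                "valid_from": effective, "valid_to": None, "is_current": True})

apply_scd2(dim_customer, "C001", "Premier", date(2024, 6, 1))
current = [r for r in dim_customer if r["is_current"]]
print(len(dim_customer), current[0]["segment"])
# 2 Premier
```

The same close-and-insert pattern underpins CDC-driven incremental loads: only changed rows are versioned, so full history is preserved without rewriting the table.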

Benefits

  • BMO offers health insurance, tuition reimbursement, accident and life insurance, and retirement savings plans.
  • To view more details of our benefits, please visit: https://jobs.bmo.com/global/en/Total-Rewards

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
