About The Position

A large public sector organization is undertaking a major digital transformation initiative to deliver simpler, more efficient, and user-centered services. A dedicated Digital Design and Delivery division has been established to drive innovation, modernization, and adoption of modern data practices across multiple departments. The organization is seeking a Senior Data Engineer to support service innovation, program reviews, and digital transformation initiatives. The role involves working within cross-functional agile teams, collaborating with product owners, designers, and engineers to deliver high-quality data solutions. The ideal candidate will have strong expertise in data engineering and analytics, with the ability to design scalable data architectures, build reliable pipelines, and generate actionable insights. This role combines hands-on engineering with analytical and stakeholder-facing responsibilities.

Requirements

  • Bachelor's degree in Computer Science, IT, or a related field.
  • 5+ years of experience as a Data Engineer and/or Data Analyst.
  • Experience ensuring data quality, security, and governance.
  • 3+ years designing dimensional models (star/snowflake schemas).
  • 3+ years developing dashboards and visualizations (Power BI, Tableau, Python).
  • 5+ years working with diverse data sources (on-prem and cloud).
  • 3+ years of experience with data migrations across environments.
  • 2+ years of experience with Git and collaborative workflows.
  • 2+ years of experience with CI/CD pipelines.
  • 2+ years of experience with containerization (Docker/Kubernetes).
  • 2+ years of experience with Infrastructure as Code (Terraform, ARM, CloudFormation).
  • 3+ years of experience with SSIS.
  • 3+ years of experience with Azure Data Factory.
  • 3+ years of experience with API-based data integration.
  • Candidates must be authorized to work within Canada.

Nice To Haves

  • Experience in application development (OOP or functional programming).
  • Experience working in large enterprise or public sector environments.
  • Familiarity with databases such as PostgreSQL, MongoDB, and Azure Cosmos DB.
  • Familiarity with tools such as Synapse, Fabric, Informatica, Talend, dbt, and Airbyte.
  • Exposure to AI/ML workflows (e.g., Databricks, Azure ML).

Responsibilities

  • Design, build, and maintain scalable data pipelines (on-premises and cloud: Azure, AWS, Google Cloud Platform).
  • Develop and optimize dimensional data models (star/snowflake schemas).
  • Integrate data from multiple sources (SQL, NoSQL, APIs, files) ensuring data quality and consistency.
  • Build and maintain ETL/ELT pipelines using tools such as SSIS and Azure Data Factory (ADF).
  • Enhance data workflows for scalability, reliability, and performance.
  • Implement CI/CD pipelines for automated deployment and testing of data solutions.
  • Manage data lakes and data warehouses with proper governance and security controls.
  • Collaborate with cross-functional teams to translate business requirements into data solutions.
  • Develop curated data marts to support analytics and reporting.
  • Analyze datasets to identify trends, patterns, and anomalies.
  • Develop dashboards and reports using Power BI, DAX, or similar tools.
  • Build predictive/descriptive models using Python or R.
  • Communicate insights clearly to both technical and non-technical stakeholders.
  • Support agile delivery and promote data-driven decision-making.
  • Mentor teams on analytics best practices and self-service BI capabilities.

Additional Information

  • Mandatory training (privacy, security, workplace conduct, etc.) may be required.