Data Engineer

Fasken | Toronto, ON
$90,000 - $110,000 | Remote

About The Position

Fasken is seeking a Data Engineer to join its growing Data Analytics & Engineering team. This role is ideal for a professional who is hands-on, detail-oriented, and passionate about building reliable, well-governed, and scalable data platforms. You will be responsible for designing, building, and maintaining data warehouses and lakehouses in Microsoft Fabric, developing robust data pipelines, ensuring ongoing data integrity, and supporting analytics and governance initiatives across the firm. This role requires strong collaboration skills, a governance-first mindset, and the ability to clearly communicate complex data issues to both technical and non-technical audiences.

Requirements

  • Bachelor’s degree in math, economics, statistics, engineering, computer science, or another quantitative field, or an equivalent level of experience
  • 4+ years of experience in data engineering or a closely related role
  • Strong experience with Microsoft Azure–based data platforms, including Azure services, Microsoft Fabric, and SQL Server; equivalent platforms such as AWS and Snowflake are also considered
  • Proficiency in SQL and Python for data transformation and pipeline development
  • Experience building and maintaining data warehouses and/or lakehouses, data pipelines, and dataflows
  • Familiarity with Power BI and semantic modeling (sufficient to build and support functional reports)
  • Experience working with data governance, access control, and compliance requirements
  • Strong troubleshooting, communication, and collaboration skills
  • Microsoft certification required (or willingness to obtain certification within the first year)
  • Strong analytical and problem-solving skills
  • Ability to work in a remote-first environment where internal clients may be located in various geographic locations and time zones
  • Excellent planning and organizational skills, with the ability to manage multiple files simultaneously and meet established deadlines
  • Professionalism, discretion and commitment to outstanding customer service
  • Autonomy, thoroughness, and the ability to manage priorities and work with multiple stakeholders

Nice To Haves

  • Knowledge of the Elite 3E ERP is an asset

Responsibilities

  • Design, build, and maintain data warehouses and data lakehouses in Microsoft Fabric
  • Develop and support ETL/ELT pipelines, dataflows, and data ingestion processes
  • Use notebooks (Python, SQL) to transform, validate, and move data across platforms
  • Build and maintain semantic models to support analytics and reporting
  • Ensure data integrity, accuracy, and consistency throughout all transformations
  • Monitor daily data loads to ensure completeness, freshness, and integrity
  • Maintain and follow a regular operational checklist to monitor data infrastructure health
  • Build Power BI dashboards, reports and alerts to monitor data movement, quality, and system performance
  • Monitor and optimize data platform compute usage to ensure efficient performance and cost control
  • Troubleshoot and resolve data-related issues and business tickets in a timely manner
  • Act as a champion and ambassador for data governance, compliance, and ethical data usage
  • Adhere to strict requirements such as ethical walls, access segregation, and confidentiality controls
  • Participate in and lead data governance initiatives, including defining and standardizing data fields and driving data clean-up and quality improvement efforts
  • Set up and maintain secure access to data resources, leveraging Microsoft Entra ID groups and role-based access control
  • Perform regular production system closing procedures at month-end and year-end
  • Collaborate closely with data analysts and senior analysts, data governance leadership, technology teams, and business stakeholders
  • Translate data issues and limitations into clear, business-friendly language
  • Support ad hoc reporting and data requests from the business
  • Partner with stakeholders to identify pragmatic, compliant, and scalable data solutions
  • Document and/or update processes and Standard Operating Procedures (SOPs)
  • Use GitHub and DevOps practices for version control, collaboration, and deployments
  • Leverage Copilot and modern tooling to optimize Python and SQL code
  • Contribute to improving standards, documentation, and repeatable patterns across the data platform