Data Integration Engineer - Contract

Softchoice
Toronto, ON

About The Position

As a senior member of the Data Engineering team, you will lead the design, development, and optimization of enterprise-grade data solutions within the Microsoft Fabric ecosystem. This role requires deep technical expertise, strategic thinking, and the ability to mentor junior engineers while collaborating with cross-functional teams. You will work closely with data scientists, analysts, and other stakeholders to ensure seamless data integration, transformation, and storage, enabling data-driven decision-making across the organization. This is a fixed-term 6-month contract.

Requirements

  • Proficient in SnowSQL, stored procedures, tasks, and data loading/unloading (COPY INTO).
  • Experience with Microsoft Fabric: Data Factory pipelines, Dataflow Gen2, and OneLake integration.
  • Expertise in Spark/Python (PySpark), or equivalent ELT tools, for data transformation.
  • Advanced SQL knowledge is essential for complex transformations.
  • 5+ years of experience working on complex data integration and pipeline development across a variety of systems.
  • Proven ability to design and orchestrate complex data integrations, leveraging Data Factory, REST APIs, and other enterprise data sources.
  • Leadership in defining and executing scalable ELT strategies using Spark notebooks, Data Factory, and Synapse pipelines.
  • Advanced knowledge of data pipeline optimization, including performance tuning and workload configuration.
  • Experience establishing naming conventions, workspace governance, and CI/CD deployment strategies.
  • Leadership in data governance, including sensitivity classifications, security models, and access control frameworks.
  • Ability to build and maintain monitoring, alerting, and automated testing for ADF pipelines and data assets.
  • Ability to align Microsoft Fabric implementations with enterprise data strategy and long-term growth plans.
  • Hands-on experience using AI coding agents and AI-assisted development tools in a data engineering context; comfort integrating AI into daily development workflows and evangelizing adoption across the team.
  • Strong problem-solving skills and ability to work in a collaborative environment.
  • Excellent verbal and written communication, presentation, organizational, and interpersonal skills.

Nice To Haves

  • Microsoft Fabric certification (DP-600 or DP-700) preferred.

Responsibilities

  • Design, develop, and maintain data pipelines to move data between Microsoft Fabric (Lakehouse/Warehouse) and Snowflake, often utilizing Fabric Data Factory.
  • Set up connectors, mirroring, or Spark notebooks to sync data from OneLake to Snowflake, including handling incremental loads.
  • Build and optimize data models (Star Schema/Data Vault) within Snowflake and Fabric for high-performance querying.
  • Monitor and optimize Fabric to Snowflake compute usage and ensure efficient data movement.
  • Ensure data security during transfer, using tools like Snowflake RBAC, Dynamic Data Masking, and Fabric's security features.
  • Design and orchestrate complex integrations across Data Factory, REST APIs, and other enterprise data services.
  • Lead scalable ELT strategies using Spark notebooks, Data Factory, and Synapse pipelines.
  • Develop reusable, optimized Spark jobs and configure compute pools for cost-efficient performance.
  • Establish naming conventions, workspace governance, and CI/CD deployment strategies across Fabric.
  • Build monitoring dashboards, alerts, and automated testing for pipelines and operations.
  • Leverage AI coding agents and AI-assisted development tools (e.g., GitHub Copilot, Windsurf, or similar) to accelerate data engineering workflows, automate repetitive tasks, and improve code quality across the Fabric ecosystem.

Benefits

  • Flexibility: Plan your workdays in a way that suits you best
  • Award-Winning Workplace: Proudly recognized as a Great Place to Work for 20 consecutive years
  • Inclusive Culture: We are committed to an inclusive culture where every team member can be their authentic self