Intermediate Data Engineer

G Adventures
Toronto, ON

About The Position

The Intermediate Data Engineer is a highly motivated and experienced engineer responsible for designing, building, and maintaining highly scalable and robust data solutions for our global operations. The role is key in our data-driven organization: you will work alongside a team of passionate engineers, developers, analysts, and subject matter experts to deliver solutions that transform the way we use data, working with the latest technologies and frameworks and applying your creativity and problem-solving skills to the challenges of managing large volumes of data in real time.

Collaborating with cross-functional teams, you will make a real impact on our organization. You will manage and maintain the automated orchestration and ingestion of data into our Redshift warehouse using Fivetran and Airflow (MWAA); assist our BI team by providing support and guidance for our BI tool, Looker; build high-quality data models using dbt Cloud with a focus on marketing data; and use Python, SQL, Git, and CI/CD tools to build integrations while maintaining high standards of documentation and coding efficiency.

Requirements

  • Bachelor's degree in computer science, data engineering, or a related technical field, or equivalent extensive practical experience.
  • Minimum of 5 years of professional data engineering experience.
  • Advanced SQL knowledge is required.
  • Advanced Python knowledge is required.
  • Experience with ETL tools is required.
  • Experience with BI/analytics tools is required.
  • Understanding of data governance principles.
  • Experience with major cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Understanding of data architecture methodologies, such as Medallion and Kimball/star schema modelling.
  • Strong and demonstrated data modelling, data mapping and data analysis experience with meticulous attention to detail.
  • Experience with version control strategies and CI/CD pipelines for automated testing, deployment, and release management.
  • Understanding of data privacy regulations (e.g., GDPR, PIPEDA) and compliance requirements.
  • Excellent problem-solving skills, attention to detail, and ability to work independently and collaboratively across teams.
  • Ability to multitask and prioritize effectively in a fast-paced environment.
  • Excellent written and oral communication skills.
  • Self-starter who appreciates autonomy and works well with minimal guidance.
  • Collaborative mindset when working with cross-functional teams.

Nice To Haves

  • Experience with Fivetran and columnar data stores such as Redshift is a plus.
  • Looker experience is a plus.
  • Experience with digital marketing data is a plus, but not required.
  • Experience with data modelling using dbt Cloud is a plus, but not required.

Responsibilities

  • Support the design and implementation of highly scalable, resilient and performant data solutions that feed into a central data warehouse which serves as the single source of truth for the organization. This includes conforming to data architecture methodologies and ensuring their consistent application.
  • Monitor and maintain data pipelines and orchestration to ensure reliable, timely and accurate ingestion from internal and external source systems.
  • Propose and implement improvements that enhance system performance, scalability, and overall reliability.
  • Stay current with emerging technologies and industry trends, suggesting innovative and forward-thinking solutions to enhance the company's data environment.
  • Adhere to best practices, coding standards, and design principles. You'll help ensure the delivery of exceptionally clean, efficient, and maintainable data models.
  • Support diagnosing, debugging, and resolving technical issues within a complex data environment involving distributed systems and critical third-party integrations.
  • Support the Principal Data Engineer in optimizing the data environment, implementing methodologies and automated tests, ensuring robust and reliable solutions, and promoting data integrity, accuracy, availability, usability, and security.
  • Monitor the performance of the data warehouse, helping diagnose bottlenecks and resource constraints to ensure scalability for growing data volumes and business requirements.
  • Work with business stakeholders, infrastructure and technical teams to implement technical solutions for efficient and timely delivery of new data sources.
  • Collaborate with subject matter experts to deliver solutions that enable the distribution of data models to BI tools and other data platforms, conforming to data engineering principles and complying with international regulations.
  • Support the business with administrative tasks such as license updates and permission changes.
  • Contribute to code reviews, provide actionable feedback, and mentor other team members.
  • Provide constructive feedback to foster a culture of continuous learning and technical growth.
  • Provide technical guidance to the Data Systems and Guru teams.
  • Contribute to high-quality documentation for our data environment, including system designs and technical procedures, ensuring knowledge transfer across the Data Systems team.

Benefits

  • Competitive Total Rewards Package, including exclusive travel perks!
  • Additional days off, including on your birthday!
  • Vacation time for you to recharge
  • Enhanced Parental Leave
  • Meaningful Employee Recognition Program
  • Learning and Growth Opportunities
  • Employee Resource Groups (applicable based on location)