Analytics Data Engineer (DBT)

Project X
Toronto, ON (Hybrid)

About The Position

We’re looking for a passionate and curious Data Engineer with strong DBT and Python skills and solid experience working with large-scale data solutions. You’ll play a key role in building high-quality, reusable ETL patterns and automating robust, scalable data pipelines that power analytics and business decision-making. Experience with Matillion ETL or Project X’s Data Pipeline Control (DPC) framework is a plus, and exposure to generative AI (GenAI) tools or techniques is considered an asset.

Requirements

  • 2+ years of hands-on experience in data engineering with a focus on Python-based development.
  • Proven experience using DBT (Data Build Tool) for data transformation and modeling.
  • Experience with ETL development (Matillion, DPC, or other orchestration tools).
  • Strong SQL skills and ability to work with structured and semi-structured data.
  • Familiarity with cloud data platforms such as Snowflake, Azure, or AWS.
  • Understanding of data quality practices, error handling, and recovery strategies.
  • Ability to interpret mapping documents and accurately implement logic.
  • Experience with version control (e.g., Git) and CI/CD pipelines is a plus.
  • Bonus: Exposure to GenAI, LLMs, or machine learning workflows.

Nice To Haves

  • Experience in data modeling, data governance, or metadata frameworks.
  • Knowledge of data observability tools or building monitoring dashboards.
  • Familiarity with orchestration tools such as Airflow.
  • Consulting experience or comfort working in fast-paced, client-facing environments.

Responsibilities

  • Develop, deploy, and maintain data pipelines using Python and tools such as Matillion ETL or DPC.
  • Collaborate with data architects and analysts to define and meet data integration requirements.
  • Translate mapping documents into production-ready data workflows.
  • Ensure data integrity by implementing quality checks, handling errors, and performing root cause analysis.
  • Apply tiered architecture design patterns (bronze/silver/gold models) to pipeline development.
  • Create or maintain metadata and control structures for visibility and traceability.
  • Work with cloud-based data warehouses (e.g., Snowflake, Azure Synapse, Redshift).
  • Participate in Agile processes, including sprint planning and reviews.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 11-50 employees
