Cogeco · Posted 5 days ago
Full-time • Mid Level
1,001-5,000 employees

Our culture lifts you up; there is no ego in the way. Our common purpose? We all want to win for our customers. We aim to always be evolving, dynamic, and ambitious. We believe in the power of genuine connections. Each employee is part of what makes us unique on the market: agile and dedicated.

Time Type: Regular

Job Description

JOB SUMMARY: Reporting to the Lead, Data Engineering, the Data Engineering Specialist is responsible for designing, developing, and maintaining data integration and transformation processes in our cloud-based data platform. While experience in Google Cloud Platform (GCP) is a significant asset, candidates with proven expertise in other major cloud platforms (AWS, Azure) will also be considered. This role emphasizes data governance, classification, and compliance, leveraging tools such as Collibra to ensure high-quality, secure, and well-documented data assets.

  • Data Integration & Architecture
    · Develop and orchestrate data pipelines for ingestion from various sources (e.g. MySQL, Oracle, PostgreSQL, flat files, etc.) into a cloud-based environment, and move data between multiple systems based on business needs and requirements.
    · Collaborate with Data Analysts and Data Architects to define data models, requirements, and architecture for optimal performance in databases (e.g. BigQuery or other cloud-based relational databases).
    · Ensure robust ETL/ELT processes that support scalability, reliability, and efficient data access.
  • Data Governance & Classification
    · Implement and maintain data governance frameworks and standards, focusing on data classification, lineage, and documentation.
    · Utilize Collibra or similar platforms to manage data catalogs, business glossaries, and data policies.
    · Work closely with stakeholders to uphold best practices for data security, compliance, and privacy.
  • Process Improvement & Automation
    · Identify, design, and implement process enhancements for data delivery, ensuring scalability and cost-effectiveness.
    · Automate manual tasks using scripting languages (e.g., Bash, Python) and enterprise scheduling/orchestration tools such as Airflow (a brief illustrative sketch follows this list).
    · Conduct root cause analysis to troubleshoot data issues and implement solutions that enhance data reliability.
  • Cross-Functional Collaboration
    · Partner with cross-functional teams (IT, Analytics, Data Science, etc.) to gather data requirements and improve data-driven decision-making.
    · Provide subject matter expertise on cloud data services, data classification standards, and governance tools.
    · Monitor and communicate platform performance, proactively recommending optimizations to align with organizational goals.
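To give a concrete sense of the orchestration work described above, here is a minimal, illustrative Airflow DAG in Python (assuming Airflow 2.x import paths). The DAG name, schedule, and task logic are hypothetical placeholders for the kind of extract-and-load pipeline this role owns, not an actual Cogeco pipeline.

```python
# Illustrative sketch only: a daily extract-and-load DAG with two tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull new rows from a source RDBMS (placeholder logic)."""
    # A real task would query MySQL/Oracle/PostgreSQL via a connection hook
    # and stage the results, e.g. as files in cloud storage.
    print("extracting orders since", context["ds"])


def load_to_warehouse(**context):
    """Load the staged extract into the cloud warehouse (placeholder logic)."""
    # A real task would load the staged files into BigQuery or another
    # cloud-based relational database.
    print("loading staged extract for", context["ds"])


with DAG(
    dag_id="orders_daily_ingest",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # run the load only after the extract succeeds
```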
  • Experience with at least one major cloud platform (AWS, Azure, GCP), with GCP exposure considered a significant asset.
  • Strong understanding of RDBMS (PostgreSQL, MySQL, Oracle, SQL Server) with the ability to optimize SQL queries and maintain database performance.
  • Familiarity with version control systems (Git) to manage codebase changes and maintain a clean development workflow.
  • Familiarity with data governance and classification concepts, leveraging Collibra or similar platforms to manage data lineage, business glossaries, and metadata.
  • Knowledge of Linux/UNIX environments, and experience working with APIs (XML, JSON, REST, SOAP).
  • Demonstrated ability to build large-scale, complex data pipelines for ETL/ELT processes.
  • Hands-on experience with scripting/programming languages (e.g., Python, Bash) to automate data workflows and error handling (a short illustrative sketch follows this list).
  • Strong analytical and problem-solving skills with the ability to work with unstructured datasets.
  • Functional knowledge of encryption technologies (SSL, TLS, SSH) and data protection measures.
  • Experience implementing governance best practices to ensure data security and regulatory compliance.
  • Excellent communication and collaboration skills to partner effectively with cross-functional teams.
  • Curiosity and a growth mindset, with the initiative to explore emerging data technologies.
  • Bachelor’s degree in Information Technology, Computer Science, or a related field; or an equivalent combination of education and experience.
  • 5 years of progressive experience in data engineering, data analytics, or a similar role.
  • Proven track record in architecting, optimizing, and delivering enterprise-grade data solutions on a major cloud platform (AWS, Azure, or GCP).
  • Demonstrated commitment to continuous learning and improvement in data engineering methodologies.
  • Bilingualism (written and spoken) is an asset to interface with stakeholders in Ontario and across the United States.
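As a small illustration of the scripting, API, and error-handling skills listed above, the Python sketch below pulls a JSON payload from a REST endpoint over TLS and stages it locally. The endpoint URL, record format, and output path are hypothetical; a production workflow would land the data in cloud storage or a warehouse staging table instead.

```python
# Illustrative sketch only: fetch JSON from a REST API with basic error handling.
import json
import logging

import requests

logging.basicConfig(level=logging.INFO)
API_URL = "https://example.com/api/v1/customers"  # placeholder endpoint


def fetch_customers(url: str, timeout: int = 30) -> list:
    """Fetch a JSON payload from a REST API, raising on any HTTP or network error."""
    try:
        # requests verifies TLS certificates by default.
        response = requests.get(url, timeout=timeout)
        response.raise_for_status()
        return response.json()
    except requests.RequestException as exc:
        logging.error("API call failed: %s", exc)
        raise


if __name__ == "__main__":
    records = fetch_customers(API_URL)
    # Stage the payload to a local file for downstream loading.
    with open("customers.json", "w", encoding="utf-8") as fh:
        json.dump(records, fh)
    logging.info("staged %d records", len(records))
```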
  • Flexibility: Yes, we think that what you do matters. At work and at home.
  • Fun: We laugh a lot, it makes every day brighter.
  • Discounted services: We provide amazing services to our clients, and you’ll get them at home, because you deserve them.
  • Rewarding Pay: Let's be honest, everybody likes to make a good salary. We offer attractive compensation packages, and it comes with a great culture.
  • Benefits: We’ve got you covered.
  • Career Evolution: Join us and we will give you the tools to achieve your career goals!
  • Technology: Do you have a passion for technology? Excellent, we do too. Here, you will manage, influence, play, create, fix, and shape the industry.