Specialist Software Development

Canadian National Railway Company
Montreal, QC
Hybrid

About The Position

At CN, every day brings new and exciting challenges. You can expect an interesting environment where you’re part of making sure our business is running optimally and safely, helping keep the economy on track. We provide the kind of paid training and opportunities that long-term careers are built on, and we recognize hard workers who strive to make a difference. You will be able to thrive in our close-knit, safety-focused culture, working together as ONE TEAM. The careers we offer are meaningful because the work we do matters. Join us!

Job Summary

The Specialist, Data Developer is responsible for building, handling, and optimizing data pipelines, and for moving them effectively into production for key data and analytics consumers. The role shapes the enterprise Data as a Service (DaaS) model and delivers on Information and Technology (I&T) business models. Moreover, the incumbent develops best practices and optimizes data pipelines to deliver products and services aligned with business expectations. The position plays a pivotal role in operationalizing data and analytics initiatives, and in defining and building CN’s data integration and DaaS platform roadmap.

Requirements

  • Minimum 5 years overall work experience
  • Minimum 3 years of experience in a Data Development role, working in different data management disciplines including data integration, modelling, optimization, and quality
  • Experience working in cross-functional teams and collaborating with business stakeholders in support of a departmental or multi-departmental data management and analytics initiative
  • Experience in translating business requirements into advanced data models able to fulfill Analysts’ and Data Scientists’ requirements
  • Experience working in an Agile team environment
  • Inspires others with impactful communications and adapts to the audience through speech and writing
  • Applies analytical thinking
  • Innovates through problem solving
  • Knows the business and stays current with industry trends to elevate expertise and work
  • Demonstrates organizational abilities
  • Collaborates with key internal stakeholders to enable higher productivity
  • Works independently with little supervision
  • Knowledge of Scala, Java or Python
  • Knowledge of software development best practices such as code reviews, testing frameworks, maintainability, and readability
  • Expertise with Databricks Delta Lake
  • Knowledge of Structured Query Language (SQL) and NoSQL technologies, and fluency in writing, executing, and optimizing SQL queries
  • Knowledge of Big Data technologies and cloud platforms such as Databricks, Apache Spark, Azure Data Factory, Azure Data Explorer, Azure Data Lake, Google BigQuery, Google Dataproc, Google Cloud Data Fusion, Google Dataflow, Google Cloud Composer, Google Dataprep, Google Dataplex, Google BigLake, Google Vertex
  • Knowledge of event-driven architecture (e.g., Pub-Sub, Kafka, Message Queuing (MQ), Message Queuing Telemetry Transport (MQTT), Advanced Message Queuing Protocol (AMQP), Event Hub, Logstash)
  • Bachelor's Degree in Computer Science, Electrical Engineering, or Software Engineering
  • Google or Azure Data Development certification

Nice To Haves

  • Any designation related to the above would be considered an asset

Responsibilities

  • Ensure optimal data delivery architecture and processes are consistent throughout ongoing projects
  • Optimize CN’s data architecture to support the next generation of products and data initiatives
  • Build, handle, and optimize data pipelines, moving them effectively into production for key data and analytics consumers
  • Build data and domain event models, implement business rules, and develop scalable data pipelines
  • Ensure compliance with data governance and security requirements while creating, improving, and operationalizing integrated and reusable data pipelines
  • Enable faster data access, integrate data reuse, and improve time-to-solution for data and analytics initiatives
  • Integrate analytics and data science results with business processes
  • Promote effective data management practices
  • Collaborate with Data Science, Reporting, Analytics and other Development teams to build data pipelines, infrastructure and tooling to support business initiatives
  • Design and develop Extract, Transform, and Load (ETL) pipelines using multiple sources of data in various formats, and deploy them to achieve a high level of reliability, scalability, and security
  • Collaborate with stakeholders and architects to model data landscape and define secure data exchange approaches
  • Meet with stakeholders to identify fit-for-purpose within CN’s existing data ecosystem and deliver options and agile solutions
  • Design and develop processing pipelines that ingest data into Data Hubs
  • Provide day-to-day support and technical expertise to both technical and non-technical teams
  • Participate in building data development expertise and framework
  • Translate business needs into technical requirements
  • Use Agile methodologies and development practices, including code reviews and testing, to develop and deliver data pipelines and to streamline project delivery in line with goals, timelines, and budgets
  • Build monitoring and debugging tools to analyze data pipelines
  • Help unify software development and operations seamlessly, efficiently, and cost effectively
  • Improve software quality, automate processes, and accelerate software releases
  • Develop and implement test plans and scripts for various data quality processes
  • Maintain manual and automated test scripts