Senior Data Engineer

RBC · Toronto, ON
Onsite

About The Position

The Senior Data Engineer is responsible for designing, building, and maintaining scalable data pipelines from various sources across RBC to create a unified view for analytics and reporting. You will work with distributed systems, cloud infrastructure, and modern data tooling to transform raw data into actionable business insights while ensuring data quality, governance, and performance at scale.

Requirements

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field, with 3+ years of professional experience as a data or software engineer.
  • Proficiency in Python, Scala, Java, and SQL, with hands-on experience using modern data tooling (e.g., Hadoop, Spark, Airflow, dbt).
  • Strong foundation in both data and software engineering, designing and building scalable data pipelines and ML-ready datasets in hybrid environments spanning on-prem infrastructure and public cloud platforms (e.g., AWS).
  • Experience building or enabling AI-driven automation (e.g., agents, workflow orchestration, or decision engines) that reduces manual effort.
  • Experience with DevOps and CI/CD tooling such as GitHub Actions to automate testing, builds, and deployments for data and ML pipelines.
  • Excellent collaboration and communication skills, with the ability to translate complex technical ideas into practical, business-focused solutions.
  • Proven analytical, communication and presentation skills.
  • Exceptional problem-solving skills and ability to conceptualize strategy.
  • Strong attention to detail, organization, multitasking, and time management skills are critical.

Nice To Haves

  • Experience in financial services industry
  • Knowledge of agentic AI systems built with LLMs (e.g., Claude Sonnet, GPT), open-source tools, and cloud technologies
  • Experience working with AWS
  • Experience developing real-time Kafka consumers in Java

Responsibilities

  • Design, build, and maintain scalable and efficient data pipelines to collect, process, and transform data from various sources into usable formats.
  • Integrate diverse data sources and formats, including structured and unstructured data, APIs, and streaming data, to create a unified view for analysis and reporting.
  • Identify and implement optimizations to improve data processing performance, scalability, and reliability.
  • Ensure data quality, reliability, governance, and performance at scale while collecting and generating data assets.
  • Collaborate with other engineers and the business to deliver solutions that meet functional and non-functional requirements and drive business value.
  • Automate workflows to minimize manual intervention and enhance overall productivity.
  • Engage with stakeholders from the business to collect and document their requirements.
  • Ensure the smooth operation of daily production tasks and promptly address any failures.
  • Work in an agile methodology and participate in all rituals, such as standups and sprint planning.

Benefits

  • Bonuses
  • Flexible benefits
  • Competitive compensation
  • Commissions
  • Stock where applicable
  • Leaders who support your development through coaching and managing opportunities
  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • A world-class training program in financial services
  • Flexible work/life balance options
  • Opportunities to do challenging work