Junior Data Engineer - Integration

Ocean Infinity
$30,000

About The Position

Worker Type: Employee
Application End Date: 08-05-2026

We are using and creating technology to transform operations at sea to enable people and the planet to thrive. We are open-minded and fearless in our approach to innovation and don't believe in boundaries. We challenge everything and have massive ambitions to drag aging industries into the tech era. We take safety, equality, and education very seriously, and our responsibilities don't stop at our front door. Our business is built on the belief that there is a more environmentally responsible way to operate at sea.

We employ people who share our core values. We expect our people to be courageous, trustworthy, and conscientious, driven by a desire to do the right thing. We strive for excellence, work collaboratively, and are genuinely excited by our work. We offer opportunities for our people to develop beyond their role, spanning a multitude of disciplines; these are open to all, regardless of background and experience level. Working with us means being part of a team that is harnessing technology and creativity to disrupt a traditional industry. We are not your average workplace.

Ocean Infinity is seeking an enthusiastic and motivated Junior Data Engineer - Integration to join our team. The successful candidate will focus on building and automating data pipelines that transform manual workflows into scalable, reliable data solutions. You will work closely with senior team members, data engineers, and product teams to develop data integrations and transformation pipelines that power analytics and operational systems across the company.

Requirements

  • A degree in Computer Science, Mathematics, or a related field, or equivalent practical experience.
  • 0-2 years of experience in Data Engineering, Backend Engineering, Software Development, or related internship/academic project experience.
  • Solid programming foundation in Python and working knowledge of SQL.
  • Basic understanding of data pipelines, APIs, or backend services.
  • Familiarity with version control systems (Git) and collaborative development workflows.
  • Eagerness to learn about workflow orchestration tools (Airflow, Prefect, Flyte) and ETL/ELT frameworks.
  • Understanding of fundamental data structures and algorithms.
  • Good communication skills and willingness to learn from data engineering teams and business stakeholders.
  • Enthusiasm for working with data, building automated pipelines, and solving practical data transformation problems.

Nice To Haves

  • Academic or personal projects involving data processing, ETL pipelines, or data transformation.
  • Exposure to cloud platforms (Azure, AWS, GCP) and data lake architectures.
  • Familiarity with SQL or NoSQL databases (Postgres, MongoDB, Redis).
  • Basic understanding of Delta Lake or lakehouse concepts.
  • Knowledge of data modeling and schema design principles.
  • Experience with CI/CD concepts or tools.
  • Basic understanding of containerization concepts (Docker).
  • Exposure to data quality frameworks or testing approaches.

Responsibilities

  • Build and maintain data pipelines that automate manual workflows, transforming raw data into curated, high-quality datasets under guidance from senior team members.
  • Assist in developing data integrations between the lakehouse and downstream products, analytics tools, or operational systems.
  • Support the implementation of Bronze → Silver → Gold transformation pipelines using modern orchestration tools.
  • Contribute to data quality efforts including implementing validation checks, monitoring pipeline health, and basic troubleshooting.
  • Help develop backend data services and APIs that expose curated datasets to internal consumers and business applications.
  • Work with Data Engineers, product teams, and business stakeholders to understand data requirements and help implement transformation logic.
  • Document data pipelines, transformation logic, and data flows, ensuring knowledge sharing across the team.
  • Learn and apply best practices in data engineering, ETL/ELT patterns, and cloud data technologies.

Benefits

  • The salary for this position varies, as we are recruiting across multiple regional locations and job grades; offers are based on the skills, abilities, and experience required.

At Ocean Infinity, we believe in creating equal opportunities for all, celebrating everyone's differences. We are driven to transform the industry through our technology, thoughts, behaviours, and actions. Being inclusive and respectful to all is fundamental to who we are: it is the right thing to do, and it enables innovation and creativity to thrive. There is more work to be done, and we know that we aren't perfect, but our commitment to these values is unwavering. They are central to our mission and the impact we have on the industry. Simply put, our mission is to use innovative technology to transform operations at sea, to enable people and the planet to thrive.