DataOps Engineer

Paymentology
Remote

About The Position

At Paymentology, we’re redefining what’s possible in the payments space. As the first truly global issuer-processor, we give banks and fintechs the technology and talent to launch and manage Mastercard and Visa cards at scale across more than 60 countries. Our advanced, multi-cloud platform delivers real-time data, unmatched scalability, and the flexibility of shared or dedicated processing instances. It’s this global reach and innovation that set us apart.

We’re looking for a DataOps Engineer to join our Data Engineering team and help build a modern data platform from the ground up. This is a greenfield opportunity focused on infrastructure, automation, and observability, playing a critical role in enabling reliable, scalable, and secure data systems. You’ll work closely with data engineers and senior technical stakeholders to design, implement, and operate the foundations of our data stack.

This role is ideal for a mid-level engineer with strong DevOps fundamentals who is eager to deepen their expertise in data platforms, cloud infrastructure, and observability within a high-impact, global fintech environment.

Requirements

  • 3-5 years of hands-on experience in DevOps, Platform Engineering, or DataOps roles.
  • Experience supporting or contributing to data platforms or data infrastructure projects.
  • Hands-on proficiency with Infrastructure as Code, particularly Terraform.
  • Experience working with AWS or GCP and common cloud architecture patterns.
  • Practical experience or strong understanding of Kubernetes and containerised workloads.
  • Familiarity with observability tooling across monitoring, logging, metrics, and alerting.
  • Strong scripting skills in Python, Bash, or Go to automate operational processes.
  • Excellent problem-solving skills and the ability to work effectively in a collaborative, fully remote environment.
  • A strong inclination to develop DataOps and MLOps knowledge and capabilities.

Nice To Haves

  • Exposure to modern data engineering tools such as dbt, Airflow, Apache Spark, or similar technologies is an advantage.

Responsibilities

  • Design and implement cloud infrastructure for a modern data platform using Infrastructure as Code, with a strong focus on scalability, security, and reliability.
  • Build and maintain CI/CD pipelines that support data engineering workflows and infrastructure deployments.
  • Implement and operate observability solutions including monitoring, logging, metrics, and alerting to ensure platform reliability and fast incident response.
  • Collaborate closely with data engineers to translate platform and workflow requirements into robust infrastructure solutions.
  • Apply best practices for availability, disaster recovery, and cost efficiency, while documenting infrastructure patterns and operational procedures.