Data Developer

iTalent PLUS · Malta, MT · Hybrid

About The Position

Our client is seeking a pragmatic, production-focused professional to join their Data team. This role is ideal for someone with strong hands-on experience in building, operating, and evolving modern data platforms within a production environment.

Requirements

  • Bachelor’s degree in Computer Science or an equivalent qualification.
  • A minimum of 3 years’ experience with one or more general-purpose programming languages, such as Python, Java, or Go.
  • Experience with ClickHouse or similar columnar database technologies, with a strong understanding of data warehousing concepts, dimensional modelling, and data lake architectures. Deep knowledge of ClickHouse internals is considered a strong advantage.
  • Solid experience with RDBMS technologies (e.g. Aurora, MariaDB, MySQL), including strong SQL scripting and optimisation skills.
  • Experience with streaming architectures and technologies such as Kafka, Kafka Connect, and Kafka Streams.
  • Hands-on experience with Apache Airflow for data orchestration and pipeline management.
  • Exposure to Prometheus and experience building or working with Grafana dashboards.
  • Familiarity with microservice orchestration and containerisation tools such as Docker and Kubernetes.
  • Experience developing cloud-native applications on AWS, including services such as EC2, Lambda, S3, Serverless, MWAA, and MSK.
  • A strong DataOps mindset, with exposure to hands-on on-premise deployments.
  • A functional programming-oriented approach to real-time streaming architectures.
  • Familiarity with distributed data processing engines such as Spark, Flink, Dask, or Presto/Trino.
  • Proficiency in Unix shell scripting.
  • Fluency in English, both written and spoken.

Responsibilities

  • Design, develop, and support data pipelines within a hybrid data platform, incorporating both batch and real-time data processing.
  • Monitor, troubleshoot, and resolve production issues across data pipelines, ensuring high availability, reliability, and performance.
  • Take a hands-on approach to Data Governance, including SLA metric capture, anomaly detection, and reporting to key stakeholders, with a strong focus on maintaining optimal data quality.
  • Own and maintain several critical data platform infrastructure components.
  • Research, evaluate, and prototype new technologies, particularly in support of the organisation’s AI/ML initiatives.
  • Actively contribute to design, scoping, and architectural discussions for new data solutions and platform features.
  • Translate business requirements and product objectives into scalable, efficient, and robust data solutions.