Hitachi Digital Services • posted 3 months ago
Full-time • Senior
Remote

We're Hitachi Digital Services, a global digital solutions and transformation business with a bold vision of our world's potential. We're people-centric and here to power good. Every day, we future-proof urban spaces, conserve natural resources, protect rainforests, and save lives. This is a world where innovation, technology, and deep expertise come together to take our company and customers from what's now to what's next. We make it happen through the power of acceleration. Imagine the sheer breadth of talent it takes to bring a better tomorrow closer to today. We don't expect you to 'fit' every requirement - your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

Key Responsibilities:
  • Architect and implement scalable, fault-tolerant, and low-latency data pipelines for real-time and batch processing.
  • Design event-driven systems using Kafka, Flink, and Spark Structured Streaming.
  • Define data models, schemas, and integration patterns for IoT and telemetry data.
  • Lead the technical direction of the data engineering team, ensuring best practices in streaming architecture and cloud-native design.
  • Provide hands-on guidance in coding, debugging, and performance tuning of streaming applications.
  • Collaborate with product, engineering, and DevOps teams to align data architecture with business needs.
  • Build and deploy real-time data processing solutions using Apache Flink and Spark Structured Streaming.
  • Integrate messaging systems (Kafka, Kinesis, RabbitMQ, etc.) with cloud-native services on AWS.
  • Ensure high availability, scalability, and resilience of data platforms supporting IoT and telemetry use cases.
  • Continuously evaluate and improve system performance, latency, and throughput.
  • Explore emerging technologies in stream processing, edge computing, and cloud-native data platforms.
  • Implement DevOps, CI/CD, and infrastructure-as-code practices.
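The event-driven design work described above centers on pub/sub decoupling: producers publish to topics and consumers subscribe independently. As a hedged, greatly simplified illustration (not part of the posting, and an in-process toy rather than a real broker like Kafka), a minimal event bus in Python:

```python
from collections import defaultdict
from typing import Callable, DefaultDict, Dict, List

class EventBus:
    """A tiny in-process stand-in for the pub/sub pattern a broker
    such as Kafka provides: handlers subscribe to named topics, and
    publish() delivers each event to every subscriber of that topic."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Dict], None]) -> None:
        # Register a handler; a real broker would track consumer offsets instead.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Dict) -> None:
        # Deliver synchronously; a real broker persists and delivers asynchronously.
        for handler in self._subscribers[topic]:
            handler(event)

# Usage: a consumer collecting telemetry events from a hypothetical topic.
bus = EventBus()
seen: List[Dict] = []
bus.subscribe("telemetry", seen.append)
bus.publish("telemetry", {"device_id": "sensor-1", "value": 20.0})
```

Real deployments add durability, partitioning, and consumer groups; the sketch only shows the decoupling idea.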

Required Skills & Technologies:
  • Apache Flink (real-time stream processing)
  • Apache Spark Structured Streaming
  • Apache Kafka or equivalent messaging queues (e.g., RabbitMQ, AWS Kinesis)
  • Event-driven architecture design
  • AWS services: S3, Lambda, Kinesis, EMR, Glue, Redshift
  • Strong programming skills in PySpark, Java, or Python
  • Experience with containerization (OpenShift)
  • Familiarity with IoT protocols and resilient data ingestion patterns
  • Knowledge of data lake and lakehouse architectures (e.g., Apache Iceberg on S3)

Preferred Qualifications:
  • Experience in building large-scale IoT platforms or telemetry systems.
  • AWS Certified Data Analytics or Solutions Architect.
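The role above calls for defining data models and schemas for IoT telemetry. As a hedged sketch of what that can look like in practice (all field names here are hypothetical, not taken from the posting), a minimal Python event schema with the JSON round-trip a Kafka producer and consumer would rely on:

```python
import json
from dataclasses import asdict, dataclass
from typing import Optional

@dataclass(frozen=True)
class TelemetryEvent:
    """One IoT telemetry reading as it might flow through a Kafka topic."""
    device_id: str
    metric: str
    value: float
    ts_ms: int                    # event time, epoch milliseconds
    unit: Optional[str] = None

    def to_json(self) -> str:
        # Serialize to the JSON payload a producer would send.
        return json.dumps(asdict(self))

    @staticmethod
    def from_json(payload: str) -> "TelemetryEvent":
        # Deserialize (and type-check via the constructor) a payload read off the stream.
        return TelemetryEvent(**json.loads(payload))

# Usage: serialize, then recover the identical event on the consumer side.
event = TelemetryEvent(device_id="sensor-42", metric="temperature",
                       value=21.5, ts_ms=1_700_000_000_000, unit="C")
roundtrip = TelemetryEvent.from_json(event.to_json())
assert roundtrip == event
```

In production, a schema registry with Avro or Protobuf typically replaces raw JSON so producers and consumers can evolve the schema safely.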

What We Offer:
  • Industry-leading benefits, support, and services that look after your holistic health and wellbeing.
  • Flexible arrangements that work for you (role and location dependent).
  • A sense of belonging, autonomy, freedom, and ownership.