Hitachi · Posted about 2 months ago
$125,000 - $135,000/Yr
Full-time • Mid Level
Remote • Charlotte, NC
5,001-10,000 employees
Professional, Scientific, and Technical Services

We are looking for a hands-on Senior DevOps Engineer with strong expertise in building and managing scalable, secure, and automated cloud-based data platforms. The ideal candidate should have a solid background in data engineering coupled with deep experience in DevOps and infrastructure automation.

GlobalLogic estimates the starting pay range for this role, performed remotely within the USA, to be $125K to $135K. This range reflects base salary only and does not include additional performance-linked variable compensation, benefits, or other items that may be applicable to the role. The range is provided as a good-faith estimate, and the amount offered may be higher or lower. GlobalLogic takes many factors into consideration in making an offer, including candidate qualifications, work experience, operational needs, travel and onsite requirements, internal peer equity, prevailing wage, responsibilities, and other market and business considerations.

Responsibilities:
  • Design, build, and automate data pipelines and orchestration workflows using GitHub Actions for CI/CD of data jobs and infrastructure components.
  • Architect and manage Azure IaaS resources supporting data workloads (VMs, Storage, Networking, Security, Monitoring).
  • Deploy and operate Kubernetes clusters for hosting containerized data services and workloads (Spark, Airflow, Kafka, etc.).
  • Implement Infrastructure as Code (IaC) with Terraform for provisioning data environments and ensuring consistency across stages.
  • Collaborate with data scientists, analysts, and platform teams to optimize data flow, performance, and scalability.
  • Enforce governance, reliability, and automation best practices across the data ecosystem.

Requirements:
  • Strong hands-on experience with GitHub Actions, Azure IaaS, Kubernetes, and Terraform (must-have).
  • Proficiency in SQL and one programming language (Python preferred).
  • Understanding of data lake, data warehouse, and streaming architectures.
  • Exposure to monitoring, logging, and cost optimization in Azure environments.
  • Strong problem-solving ability and an automation-first approach to data engineering.

What we offer:
  • Culture of caring.
  • Learning and development.
  • Interesting & meaningful work.
  • Balance and flexibility.
  • High-trust organization.