Staff Engineer Data Platform

CME Group · Chicago, IL
Hybrid

About The Position

The Data Platform Engineering team is a group of highly skilled individuals dedicated to building and scaling the infrastructure that powers our organization's data-driven initiatives. We are building a cutting-edge platform on Google Cloud to handle petabyte-scale data processing, real-time analytics, and machine learning workloads. We champion a DevSecOps culture, and our core principles center on automation, quality, and Infrastructure as Code (IaC) to raise the bar on how we manage and deploy data infrastructure.

About The Role

As a Data Platform Engineer, you will be responsible for building and maintaining the self-service platforms and frameworks used by our Data Scientists and Engineers. Your work will directly accelerate the development and deployment of batch and streaming data pipelines built with Spark and Flink, and enable seamless, governed access to data in BigQuery. You will be a key contributor to our automation strategy, utilizing tools like Terraform for infrastructure provisioning and Argo Workflows for orchestrating complex data jobs on Google Kubernetes Engine (GKE).

If you are passionate about building robust infrastructure for large-scale data processing, cloud-native data architectures, and empowering data practitioners, this job is for you!

Requirements

  • Strong proficiency in a high-level programming language such as Python, Java, or Golang.
  • Deep experience with cloud platforms, specifically Google Cloud (GCP).
  • Hands-on experience with large-scale data processing frameworks like Apache Spark or Apache Flink.
  • Proven hands-on experience with containerization (Docker) and orchestration, preferably Kubernetes.
  • Excellent proficiency with version control systems, particularly Git.
  • Strong understanding of Infrastructure as Code (IaC) and GitOps principles.
  • Strong understanding of data warehousing solutions, with hands-on experience in Google BigQuery being a significant advantage.
  • Experience working independently, as well as in a team environment, proactively driving initiatives and tasks to meet delivery timelines.
  • Excellent oral and written communication skills.
  • Experience building and managing complex data pipelines using orchestration tools like Argo Workflows or Airflow.
  • Familiarity with building CI/CD pipelines (e.g., Jenkins, GitLab CI, Argo Workflows) for data applications.

Nice To Haves

  • Proficiency with Infrastructure as Code (IaC) tools, particularly Terraform.

Responsibilities

  • Design, build, and operate the core infrastructure for our data platform on Google Cloud.
  • Develop self-service tools and automation to improve the productivity of Data Engineering and Data Science teams.
  • Partner with application teams to ensure data services are well-architected, secure, and optimized for performance and cost.
  • Serve as a Subject Matter Expert (SME) for core platform components, including our Kubernetes environment (GKE), data processing engines (Spark, Flink), and data warehousing (BigQuery).
  • Mentor junior team members and lead technical initiatives to define the future of our data infrastructure toolset.

Benefits

  • Annual target bonus opportunity.
  • Broad-based equity program.
  • Comprehensive health coverage.
  • Retirement package that includes both a 401(k) and an active pension plan.
  • Highly competitive education reimbursement provisions.
  • Paid time off.
  • Mental health benefit.