About The Position

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Big Data Engineer with Google Cloud (GCP) experience in the United States. As a Big Data Engineer, you will play a pivotal role in architecting and implementing scalable data solutions on Google Cloud Platform. You will lead the migration of on-premises data systems to the cloud, build enterprise-level data pipelines, and optimize big data workloads for performance and cost-efficiency.

This role offers the opportunity to work with cutting-edge cloud technologies such as BigQuery, Cloud Dataflow, Cloud Composer, and Kubernetes Engine, while influencing enterprise-wide data strategies. You will collaborate with cross-functional teams, providing technical guidance and ensuring robust, secure, and high-performing data infrastructure. The position combines technical depth with leadership, allowing you to shape cloud transformation initiatives and deliver impactful solutions at scale.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Google Cloud Certified Professional Cloud Architect certification required.
  • 8+ years of experience in data engineering, including 3–5+ years on GCP with deep hands-on experience designing and deploying cloud data solutions.
  • Expertise in ETL and Big Data tools such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, and Google Data Studio.
  • Experience with NoSQL databases, search technologies (Lucene, Elasticsearch), and relational databases (Cloud Spanner, Cloud SQL).
  • Strong software development background, with knowledge of SDLC, DevOps, CI/CD, and Agile methodologies.
  • Proficiency in data visualization tools such as Kibana, Grafana, and Tableau.
  • Excellent communication and leadership skills, with experience advising stakeholders and collaborating with cross-functional teams.

Nice To Haves

  • Knowledge of GCP storage lifecycle management.
  • Experience with BigQuery slot management and cost optimization.
  • Hadoop ecosystem tools (HDFS, Hive, Spark, Kafka, NiFi, Oozie, Splunk).

Responsibilities

  • Architect, design, and implement data pipelines and enterprise infrastructure on Google Cloud Platform (GCP).
  • Lead cloud transformation and migration projects, including strategy, design, and implementation of private and public cloud solutions.
  • Utilize GCP services such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, and Cloud Storage to build scalable data platforms.
  • Design and manage relational and NoSQL databases including Cloud Spanner, Cloud SQL, Cloud Bigtable, and Cloud Firestore.
  • Implement advanced analytics, AI, and machine learning pipelines to support business intelligence and data-driven decision-making.
  • Monitor and optimize cloud workloads for cost, performance, and security efficiency.
  • Provide technical guidance and subject matter expertise to cross-functional teams and senior management.
  • Ensure proper documentation, best practices, and continuous improvement in cloud architecture and governance.

Benefits

  • Competitive long-term contract compensation.
  • Remote work flexibility (location-dependent within the U.S.).
  • Opportunity to lead large-scale cloud migration and data transformation projects.
  • Exposure to cutting-edge GCP technologies and enterprise-scale big data infrastructure.
  • Collaborative environment with mentorship opportunities and influence on enterprise architecture.
  • Professional growth through complex, high-impact data engineering initiatives.