GCP Data Engineer

Qode, Mahwah, NJ

About The Position

We are looking for a skilled GCP Data Engineer with strong expertise in BigQuery, Dataflow, Cloud Composer, Python, and SQL to build and optimize scalable data pipelines on Google Cloud. The ideal candidate will have hands-on experience migrating ETL workloads from legacy systems and exposure to AI-native engineering tools that accelerate development and improve productivity.

Requirements

  • Strong hands-on experience in Google Cloud Platform (GCP) – BigQuery, Dataflow, Cloud Composer
  • Proficiency in Python for data processing
  • Advanced knowledge of SQL (joins, window functions, performance tuning)
  • Experience in ETL/ELT pipeline development and migration to cloud
  • Understanding of data warehousing and data modeling concepts
  • Experience working with large-scale distributed data systems
  • 6-15+ years of overall experience, including 3-5 years of relevant work experience with GCP services
  • B.Tech., M.Tech., or MCA degree from a reputed university

Nice To Haves

  • Knowledge of PySpark/Dataproc
  • Knowledge of Linux shell scripting
  • Familiarity with GitHub, Jenkins, Jira, etc.

Responsibilities

  • Develop and maintain scalable data pipelines using GCP services
  • Build and optimize ETL/ELT workflows using Dataflow and BigQuery
  • Orchestrate workflows using Cloud Composer (Apache Airflow)
  • Perform data migration from legacy systems (e.g., Teradata, on-prem databases) to GCP
  • Develop reusable and efficient Python-based data processing frameworks
  • Write optimized and complex SQL queries for data transformation and analytics
  • Leverage AI-native engineering tools (e.g., code assistants, automated testing, query optimization tools) to improve engineering throughput
  • Ensure data quality, validation, and governance compliance
  • Monitor and troubleshoot data pipelines and production issues
  • Optimize pipelines for performance, scalability, and cost efficiency