Senior GCP Data Engineer

Capgemini · Atlanta, GA · Hybrid

About The Position

Data engineers build reliable, scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. This position is based in Atlanta, GA and Nashville, TN and requires hybrid work participation. We are looking for a Senior GCP Data Engineer with strong data engineering experience to build and maintain complex technology initiatives. The role involves hands-on development and implementation of data platforms on Google Cloud Platform; the specific duties are listed under Responsibilities below.

Requirements

  • 5+ years of overall IT experience with 3+ years specifically in cloud data engineering.
  • Proven experience building and deploying GCP-based data pipelines.
  • Background in large enterprise data programs across industries.
  • Deep expertise in Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow (Apache Beam), Pub/Sub, Dataproc (Spark), and Composer (Airflow).
  • Strong knowledge of data modeling, ETL/ELT, streaming architectures, and SQL.
  • Experience with CI/CD, Terraform, and DataOps practices.
  • Google Professional Data Engineer Certification.
  • Strong communication and stakeholder management abilities.
  • Ability to explain technical concepts to technical and non-technical audiences.
  • Experience working in global, distributed teams.
  • Passion for innovation and engineering excellence.

Nice To Haves

  • Exposure to AI/ML enablement on GCP (Vertex AI is a plus).

Responsibilities

  • Build, develop, and maintain end-to-end data pipelines (ETL/ELT) on Google Cloud Platform.
  • Implement data ingestion, processing, storage, and analytics using BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, and Composer (see the sketches after this list).
  • Implement modern data lakehouse and data warehouse architectures aligned with analytics and AI use cases.
  • Ensure data quality, validation, and adherence to data governance and security standards.
  • Develop infrastructure as code using Terraform or equivalent tools.
  • Collaborate with architects and platform teams on CI/CD, DataOps, and monitoring strategies.
  • Apply GCP security best practices, including IAM, networking, encryption, and compliance controls.
  • Partner with business and technology stakeholders to translate technical designs into functional, efficient code.
  • Act as a hands-on technical developer on cloud data modernization, applying GCP best practices.
  • Support solution implementation, including testing, debugging, and production support.
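As a rough illustration of the pipeline work described above, here is a minimal sketch using the Apache Beam Python SDK (the framework behind Dataflow) to stream JSON events from a Pub/Sub topic into a BigQuery table. The project, topic, and table names are hypothetical placeholders, not details from this posting.

    # Minimal streaming pipeline sketch: Pub/Sub -> parse JSON -> BigQuery.
    # Assumes the Apache Beam Python SDK with GCP extras (apache-beam[gcp]);
    # all resource names below are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        # Decode a Pub/Sub message payload into a row dict for BigQuery.
        return json.loads(message.decode("utf-8"))


    def run() -> None:
        # streaming=True lets Dataflow run this as a long-lived streaming job.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/events")
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    # Table is assumed to already exist, so no schema is passed.
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()

In practice such a pipeline would be submitted with the DataflowRunner and parameterized through pipeline options rather than hard-coded resource names.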
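Composer orchestration is typically expressed as an Airflow DAG. The sketch below, again with hypothetical IDs, table names, and SQL, schedules a daily BigQuery transformation using the Google provider's BigQueryInsertJobOperator.

    # Sketch of a Cloud Composer (Airflow 2.x) DAG that runs a daily BigQuery job.
    # Assumes apache-airflow with the Google provider installed; the DAG id,
    # table names, and SQL are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_events_rollup",
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Aggregate the raw event stream into a daily reporting table.
        build_rollup = BigQueryInsertJobOperator(
            task_id="build_daily_rollup",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE analytics.daily_rollup AS "
                        "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
                        "FROM analytics.events GROUP BY day"
                    ),
                    "useLegacySql": False,
                }
            },
        )

On Composer, a DAG file like this is uploaded to the environment's dags/ bucket; scheduling, retries, and monitoring then come from Airflow itself.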

Benefits

  • Paid time off based on employee grade (A-F), as defined by policy: vacation (12-25 days, depending on grade), company-paid holidays, personal days, and sick leave
  • Medical, dental, and vision coverage (or provincial healthcare coordination in Canada)
  • Retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada)
  • Life and disability insurance
  • Employee assistance programs
  • Other benefits as provided by local policy and eligibility