Associate Data Engineer - GCP

Capgemini, Nashville, TN

About The Position

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

The Data Engineer – API serves as a key development and technical leadership resource for designing, developing, testing, implementing, documenting, and maintaining NextGen enterprise data solutions on Google Cloud Platform (GCP). This role works closely with cross‑functional data teams in a highly matrixed environment and plays a critical role in shaping scalable, cloud-native, and AI-enabled data platforms. Given the rapidly evolving GCP ecosystem, the ideal candidate remains current with emerging technologies and consistently applies new innovations to drive business value. This is a hands-on technical leadership role focused on API engineering best practices, test-driven development, CI/CD, and automated deployments.

Requirements

  • Strong understanding of GCP data architecture, design patterns, and best practices.
  • 2+ years of hands-on experience with GCP, including several of the following: Postman, Dynatrace, Cloud Run, GKE, Cloud Functions, Bigtable, Cloud SQL, Cloud Spanner, BigQuery, Cloud Logging, CI/CD pipelines, Vertex AI, NLP services, GitHub.
  • 4+ years of hands-on experience in several of the following areas: API development and integration; Apigee; the Python FastAPI framework; Spark Streaming; Kafka; data formats such as SQL, JSON, Avro, and Parquet; programming languages such as Java, Python, or Scala.

Nice To Haves

  • Google Cloud Professional Data Engineer (preferred but not required)

Responsibilities

  • Collaborate with data engineers, data architects, data scientists, and internal stakeholders to understand business and product requirements, and design scalable data platforms and pipelines.
  • Design, develop, and support enterprise-grade APIs that accelerate the time from idea to insight.
  • Build and support a GCP-based data ecosystem for enterprise-wide analytics, supporting structured, semi-structured, and unstructured data.
  • Implement automated workflows to reduce manual and operational effort while defining and enforcing SLAs for timely data delivery.
  • Enable a self-service data architecture, including query exploration, dashboards, data catalogs, and advanced data discovery capabilities.
  • Set technical direction across groups of applications and related technologies, ensuring solutions meet business, architectural, and operational constraints.
  • Promote and enforce API best practices, standards, governance, and documentation.
  • Produce high-quality, modular, reusable, and maintainable code that exemplifies engineering best practices.
  • Champion test-driven development, continuous integration, and automated deployment pipelines.
  • Provide technical mentorship and coaching to team members on complex data initiatives and Agile delivery practices.
  • Foster a collaborative team environment with strong communication and a focus on collective success.

Benefits

  • Paid time off based on employee grade (A-F), as defined by policy: vacation (12-25 days, depending on grade), company-paid holidays, personal days, and sick leave
  • Medical, dental, and vision coverage (or provincial healthcare coordination in Canada)
  • Retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada)
  • Life and disability insurance
  • Employee assistance programs
  • Other benefits as provided by local policy and eligibility