GCP Solution Architect (P-153)

Smash CR
New York, NY (Hybrid)

About The Position

You will bridge business intake and technical execution by designing end-to-end data platform solutions on Google Cloud Platform (GCP). This role translates product and business requirements into scalable, secure, and production-ready architectures that enable delivery teams to execute with clarity and confidence.

Requirements

  • 5–7 years of experience in enterprise-scale data platform architecture.
  • 2–3+ years of hands-on experience with GCP data platforms.
  • Proven experience leading at least two end-to-end GCP data pipeline implementations or migrations.
  • Strong understanding of GCP landing zones, IAM, networking, and security models.
  • Expertise in enterprise ETL and data engineering from diverse data sources into GCP.
  • Deep knowledge of data ecosystem concepts (pipelines, integration patterns, lineage, metadata, data quality, monitoring).
  • Ability to architect holistically across governance, reliability, performance, scalability, and security.
  • Experience working in complex enterprise or consulting environments.
  • Strong communication skills to bridge technical and non-technical stakeholders.

Nice To Haves

  • Experience working directly with Google Cloud or prior Google employment.
  • Experience building data platforms from zero to production-ready environments.
  • Multi-cloud knowledge (AWS or Azure) to translate architectural patterns.
  • Domain-specific data solution experience.

Responsibilities

  • Translate business and product requirements into comprehensive GCP-based technical architectures.
  • Design enterprise-scale data platforms, including ingestion, transformation, storage, and analytics layers.
  • Lead end-to-end data pipeline migrations into GCP environments.
  • Architect landing zones, IAM structures, networking, and security models in GCP.
  • Define governance frameworks including lineage, metadata, data quality, and monitoring standards.
  • Design reliable, scalable, and performance-optimized data ecosystems.
  • Partner with stakeholders to clarify scope, constraints, and trade-offs.
  • Provide technical guidance to delivery teams for implementation readiness.
  • Evaluate integration patterns and enterprise data flows across multiple systems.
  • Contribute to best practices for enterprise ETL and data engineering modernization.