West Monroe
Full-time • Mid Level
New York, NY
1,001-5,000 employees

As a Data Architect, you will lead the design and delivery of cloud-native data architectures and solutions for our clients. You will work closely with business stakeholders, data engineers, and developers to build robust data platforms that enable advanced analytics, machine learning, and real-time data processing. This role requires a mix of technical expertise, consulting skills, and leadership to drive successful outcomes in data-driven projects.

At West Monroe, we work with you. We’re a global business and technology consulting firm passionate about creating measurable value for our clients and delivering real-world solutions. The combination of business and technology is not new, but how we bring them together is unique: we’re fluent in both. We know that technology alone is not the answer; how we apply it is. We rely on data to constantly adapt and solve new challenges, taking actions that work today and produce outcomes that generate value for years to come. At West Monroe, we zero in on the heart of the opportunity, getting to results faster and preparing people for what’s next. You’ll feel the difference in how we work. We show up personally, right there in the room with you, co-creating through the challenges. With West Monroe, collaboration isn’t a lofty promise but a daily action. We work together with you to turn vision into clear action with lasting impact.

Responsibilities:
  • Design and implement scalable, secure, and high-performance data architectures on Google Cloud Platform (GCP).
  • Define and implement data lake and data warehouse architectures using GCP services such as BigQuery, Cloud Storage, Dataplex, and Dataform.
  • Develop strategies for data migration to GCP from on-premises or other cloud platforms, ensuring minimal disruption and optimal performance.
  • Architect and oversee the implementation of batch and streaming data pipelines using tools such as Apache Beam, Dataflow, Dataproc, and Cloud Composer.
  • Guide the development of data models optimized for performance, scalability, and cost-efficiency in BigQuery and other GCP services.
  • Define and implement best practices for data governance, lineage, security, and compliance in GCP environments, leveraging tools like Cloud DLP, IAM, and Policy Analyzer.
  • Partner with stakeholders to establish real-time analytics pipelines using services like Pub/Sub, Dataflow, and BigQuery streaming (a minimal streaming sketch follows this list).
  • Provide expertise in data partitioning, clustering, and query optimization to reduce costs and improve performance.
  • Lead the adoption of serverless solutions and modern data engineering practices, including CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or Terraform.
  • Evaluate and recommend GCP-native AI/ML tools such as Vertex AI and AutoML for advanced analytics and predictive modeling.
  • Serve as a trusted advisor to clients, presenting technical solutions, architectural roadmaps, and cost optimization strategies.
  • Conduct workshops, proofs of concept (POCs), and training sessions to help clients adopt GCP technologies.
  • Lead end-to-end implementation of data solutions, including ETL/ELT pipelines, data lakes, and data warehouses, ensuring delivery within scope, budget, and timeline.
  • Troubleshoot and resolve complex issues related to GCP infrastructure, data pipelines, and integrations.
  • Monitor and optimize the performance and cost of GCP data systems, leveraging tools like Cloud Monitoring, Cloud Logging, and BigQuery BI Engine.
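To make the streaming responsibilities above concrete, here is a minimal Apache Beam sketch in Python of the Pub/Sub-to-BigQuery pattern a Dataflow pipeline typically follows. It is an illustration only; the project, topic, table, and schema names are placeholders, not client systems:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    # Runner, project, region, and temp_location are supplied on the command
    # line, e.g. --runner=DataflowRunner --project=<id> --region=<region>.
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # Pub/Sub is unbounded

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Launched with --runner=DataflowRunner, the same code runs as a managed Dataflow job; with Beam’s default DirectRunner it can be exercised locally during development.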

Qualifications:
  • 7+ years of experience in data architecture, data engineering, or related roles, with at least 3 years of hands-on experience in Google Cloud Platform (GCP).
  • Proven track record of delivering data lake, data warehouse, and real-time analytics solutions on GCP.
  • Expertise in GCP services including BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Cloud SQL/Spanner.
  • Proficiency in designing and implementing ETL/ELT pipelines using Cloud Data Fusion, Apache Beam, or Cloud Composer.
  • Experience with streaming data pipelines using Pub/Sub and Dataflow.
  • Familiarity with Vertex AI, AutoML, and AI Platform Pipelines for machine learning workflows.
  • Strong understanding of IAM roles, service accounts, VPC Service Controls, and encryption best practices.
  • Proficiency in SQL for data modeling, querying, and optimization in BigQuery (see the table-design sketch after this list).
  • Strong programming skills in Python or Java, with experience in building reusable data pipelines and frameworks.
  • Experience with Terraform or Deployment Manager for infrastructure as code (IaC) in GCP environments.
  • Familiarity with CI/CD pipelines for data workflows using Cloud Build or other DevOps tools.
  • Proven ability to lead technical teams and deliver complex projects.
  • Excellent communication and stakeholder management skills, with the ability to explain technical concepts to non-technical audiences.
  • GCP certifications such as Professional Data Engineer or Professional Cloud Architect are preferred.
  • Experience with data mesh or data fabric architectures is a plus.
  • Knowledge of multi-cloud and hybrid cloud strategies is a plus.
  • Familiarity with other cloud platforms such as AWS or Azure is a plus.
  • Hands-on experience with data observability tools such as Monte Carlo or Databand is a plus.
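As a concrete instance of the partitioning and clustering guidance above, the following sketch uses the google-cloud-bigquery Python client to create a date-partitioned table clustered by a customer key, which is how BigQuery prunes scanned data and lowers query cost. The project, dataset, and column names are assumptions for illustration:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical table: partition on the order date and cluster on customer_id
# so queries that filter on these columns scan fewer bytes.
table = bigquery.Table(
    "my-project.analytics.orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

The same design can be expressed as SQL DDL with PARTITION BY and CLUSTER BY clauses, or managed declaratively through Terraform in line with the infrastructure-as-code expectations listed above.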

Benefits:
  • Employees (and their families) are covered by medical, dental, vision, and basic life insurance.
  • Employees may enroll in our company’s 401(k) plan and purchase shares through our employee stock ownership program, and are eligible to receive annual bonuses.
  • Employees will also receive unlimited flexible time off and ten paid holidays throughout the calendar year.
  • Employees are eligible for ten weeks of paid parental leave beginning on their hire date.