About The Position

gTech’s Product and Tools Operations team (gPTO) leverages deep user, operational, and technical insights to innovate Google's Ads products into customer experiences that are so intuitive (or automated) that they require no support at all. gPTO partners closely with gTech’s Support, Professional Services, Product Management, and Engineering teams to innovate and simplify our Ads products and build the productivity tools ecosystem for gTech users.

Global Business and Operations Engineering (GBO Engineering) is an embedded product and engineering team within Google’s ad sales organization that builds consumer-grade internal tooling for sales, marketing, and service teams. Within GBO Engineering, the Data Hub team is the center of excellence for data engineering across the wider business organization. Data Hub owns mission-critical infrastructure powering production systems, as well as the novel data deep dives and pipelines used in strategic decision-making at the highest levels of Google's leadership.

In this role, you will play a key part in building the foundational data infrastructure that powers Google's advertising products. You will be responsible for designing, developing, and maintaining a robust and scalable data warehouse solution that serves the needs of our internal teams and ultimately drives value for our customers.

The US base salary range for this full-time position is $130,000-$187,000 + bonus + equity + benefits. Our salary ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific salary range for your preferred location during the hiring process. Please note that the compensation details listed in US role postings reflect the base salary only, and do not include bonus, equity, or benefits. Learn more about benefits at Google [https://careers.google.com/benefits/].

Requirements

  • Bachelor's degree or equivalent practical experience.
  • 3 years of experience coding in one or more programming languages.
  • 3 years of experience working with data infrastructure and data models by writing exploratory queries and scripts.
  • 3 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal stacks (e.g., Flume) and external stacks (e.g., Dataflow, Spark); a minimal illustrative pipeline sketch follows this list.
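
As a rough illustration of the kind of pipeline work this requirement describes, here is a minimal Apache Beam sketch in Python (Beam is the open-source SDK used with Dataflow). It is a sketch only, not the team's actual stack: the bucket path, project, dataset, field names, and CSV layout are all hypothetical, and a production pipeline would add schema management, validation, and partitioning.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(line):
        # Hypothetical CSV layout: timestamp, user_id, amount.
        ts, user_id, amount = line.split(",")
        return {"event_ts": ts, "user_id": user_id, "amount": float(amount)}

    def run():
        # Runs locally by default; pass --runner=DataflowRunner to run on Dataflow.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
                | "Parse" >> beam.Map(parse_event)
                | "WriteFact" >> beam.io.WriteToBigQuery(
                    "example-project:warehouse.fact_events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )

    if __name__ == "__main__":
        run()

The same batch pipeline could be scheduled to append into a date-partitioned fact table, which is the usual dimensional-modeling pattern this requirement alludes to.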

Nice To Haves

  • 3 years of experience with statistical methodology and data consumption tools such as Colab, Jupyter notebooks, Tableau, Power BI, Data Studio, and other business intelligence platforms.
  • 3 years of experience partnering with stakeholders (e.g., users, partners, customers) and managing stakeholder and customer relationships.
  • 3 years of experience developing project plans and delivering projects on time within budget and scope.
  • 3 years of enterprise experience leveraging both SQL and another programming language (e.g., Java).

Responsibilities

  • Design, build, and maintain Extract, Transform, and Load (ETL) pipelines using SQL, Java, Python, and other technologies to ingest and transform data from various sources.
  • Develop and manage the data warehouse storage layers, leveraging Google BigQuery (or equivalent internal technologies) and other technologies.
  • Create and maintain data marts and presentation layers to support reporting and analysis needs (see the sketch after this list).
  • Collaborate with product owners, business stakeholders, and data analysts to understand data requirements and translate them into technical solutions.
  • Contribute to the development and implementation of data governance policies and procedures.
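
As a hedged illustration of the data-mart responsibility above (not the team's actual implementation), the following sketch uses the google-cloud-bigquery Python client to materialize a simple presentation-layer table from a hypothetical fact table; the project, dataset, table, and column names are assumptions made for the example.

    from google.cloud import bigquery

    # Hypothetical project, dataset, and table names for illustration only.
    client = bigquery.Client(project="example-project")

    # Roll the raw fact table up into a reporting-friendly daily mart.
    mart_sql = """
    CREATE OR REPLACE TABLE `example-project.marts.daily_spend` AS
    SELECT
      DATE(event_ts) AS day,
      user_id,
      SUM(amount) AS total_spend
    FROM `example-project.warehouse.fact_events`
    GROUP BY day, user_id
    """

    client.query(mart_sql).result()  # blocks until the query job completes

In practice a mart like this would typically be rebuilt or incrementally refreshed on a schedule, with the presentation layer exposed to BI tools rather than to the raw fact tables.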

Benefits

  • Bonus
  • Equity
  • Benefits