About The Position

gTech’s Product and Tools Operations team (gPTO) leverages deep user, operational, and technical insights to turn Google's Ads products into customer experiences so intuitive (or automated) that they require no support at all. gPTO partners closely with gTech’s Support, Professional Services, Product Management, and Engineering teams to innovate and simplify our Ads products and build the productivity tools ecosystem for gTech users.

The Geo team is focused on building the most accurate, comprehensive, and useful maps for our users through products like Maps, Earth, Street View, Google Maps Platform, and more. Every month, more than a billion people rely on Maps services to explore the world and navigate their daily lives. The Geo team also enables developers to use the power of the Google Maps Platform to enhance their apps and websites. As they plot a course for the future of mapping, they are solving computer science problems, designing beautiful and intuitive product experiences, and improving our understanding of the real world.

In this role, you will support the Geo Anti-Scraping Program (GASP) team, which builds technology to protect Geo assets at scale. As a Data Engineer, you will build and manage data pipelines to ensure data sources can be consumed by GASP for abuse and scraping detection, monitoring, and measurement.

Requirements

  • Bachelor's degree or equivalent practical experience.
  • 5 years of experience designing, building, and managing production-grade ETL pipelines.
  • Experience writing readable, structured code (e.g., Python, Java) and applying AI/ML libraries/frameworks (e.g., TensorFlow, Vertex AI) within data systems.
  • Experience applying AI/ML techniques to anti-abuse, security, or fraud detection.

Nice To Haves

  • Experience using AI to drive automation for data pipelines and data quality.
  • Experience with data logging, monitoring, and analysis tools.
  • Familiarity with distributed systems.
  • Ability to manage project timelines and deliverables effectively.
  • Understanding of bot detection techniques and adversarial machine learning.
  • Excellent stakeholder engagement skills, with experience working with both internal teams and external vendors.

Responsibilities

  • Architect, build, and maintain data pipelines to ingest, process, and transform logs and signals from various Geo services for scraping detection and analysis.
  • Implement and manage data quality frameworks, leveraging AI and automation to enhance data ingestion, ensure data integrity, and improve the accuracy of anti-scraping models and analytics.
  • Develop and maintain curated datasets, reports, queries, and dashboards to support client users in understanding and mitigating scraping threats.
  • Drive the creation of automated solutions and self-service tools to accelerate data-driven decision-making and improve the efficiency of anti-scraping operations and partner care experiences.
  • Provide operational support for anti-scraping data systems, including performance monitoring, post-launch issue resolution, and proactive planning for dependency changes and system migrations, while defining and driving the long-term technical goals and roadmap for a scalable, resilient, and cost-effective anti-scraping data infrastructure for Google Maps.