Senior Analyst, Content Adversarial Red Team

Google
Washington, D.C.
$160,000 - $237,000

About The Position

Trust and Safety team members are tasked with identifying and taking on the biggest problems that challenge the safety and integrity of our products. They use technical know-how, excellent problem-solving skills, user insights, and proactive communication to protect users and partners from abuse across Google products like Search, Maps, Gmail, and Google Ads. On this team, you're a big-picture thinker and strategic team player with a passion for doing what's right. You work globally and cross-functionally with Google engineers and product managers to identify and fight abuse and fraud at Google speed - with urgency. And you take pride in knowing that every day you are working hard to promote trust in Google and ensure the highest levels of user safety.

The Content Adversarial Red Team (CART) within Trust and Safety conducts unstructured adversarial testing of Google's premier generative AI products to uncover emerging content risks not identified in structured evaluations. CART works alongside product, policy, and enforcement teams to build the safest possible experiences for Google users.

In this role, you will develop and drive the team's strategic plans while acting as a key advisor to executive leadership, leveraging cross-functional influence to advance safety initiatives. As a member of the team, you will mentor analysts and foster a culture of continuous learning by sharing your deep expertise in adversarial techniques. Additionally, you will represent Google's AI safety efforts in external forums, collaborating with industry partners to develop best practices for responsible AI and solidifying Google's position as a thought leader in the field.

At Google we work hard to earn our users' trust every day. Trust and Safety is Google's team of abuse-fighting and user trust experts working daily to make the internet a safer place. We partner with teams across Google to deliver bold solutions in abuse areas such as malware, spam, and account hijacking. A team of Analysts, Policy Specialists, Engineers, and Program Managers, we work to reduce risk and fight abuse across all of Google's products, protecting our users, advertisers, and publishers across the globe in over 40 languages.

Requirements

  • Bachelor's degree or equivalent practical experience.
  • 10 years of experience in data analytics, trust and safety, policy, cybersecurity, business strategy, or a related field.
  • Experience in Artificial Intelligence or Machine Learning.

Nice To Haves

  • Master's degree or PhD in a relevant field.
  • 3 years of experience in red teaming, vulnerability research or penetration testing.
  • Experience working with engineering and product teams to create tools, solutions, or automation to improve user safety.
  • Experience with machine learning.
  • Experience in SQL, data collection/transformation, visualization/dashboards, or a scripting/programming language (e.g., Python).
  • Excellent problem-solving and critical thinking skills with attention to detail in an ever-changing environment.

Responsibilities

  • Lead and guide the team's efforts in identifying and analyzing high-complexity content risks, with a special focus on the safety of users under 18, and influence cross-functional teams, including Product, Engineering, Research, and Policy, to drive the implementation of safety initiatives.
  • Develop and deploy tailored red teaming exercises that identify emerging, unanticipated, or unknown threats.
  • Drive the creation and refinement of new red teaming methodologies, strategies, and tactics to help build the under-18 (U18) red teaming program and ensure coherence and consistency across all testing modalities.
  • Design, develop, and oversee the execution of innovative red teaming strategies to uncover content abuse risks.
  • Act as a key advisor to executive leadership on content safety issues, providing actionable insights and recommendations.
  • Note: This role involves exposure to graphic, controversial, or upsetting content.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Number of Employees: 5,001-10,000 employees
