Google has introduced a new tool that lets users request the removal of non-consensual explicit images from its Search results.
The removal process begins with users selecting an image and clicking the three-dot menu next to it. They can then choose “remove result,” followed by “It shows a sexual image of me.” Other options include flagging an image of a person under 18 or an image containing personal information. If the sexual image option is chosen, users are prompted to indicate whether the image is real or an AI-generated deepfake. The system also supports submitting multiple photographs at once.
Upon submitting a request, Google says users will immediately receive links to organizations offering emotional and legal support. Users can also opt in to safeguards that filter out similar results in subsequent searches. The feature is expected to roll out in most countries soon.
Users can monitor the status of their removal requests through Google’s “Results about you” hub. Using the tool requires providing personal contact information and government identification numbers. The “Results about you” hub previously allowed users to track personal information appearing in Search; it will now also scan for social security numbers, driver’s license details, and passport information. Google intends to notify users if this information is found in Search results so they can proceed with removal.
Updates to the “Results about you” hub are scheduled for rollout to users in the United States in the near future. This development coincides with Google discontinuing its dark web reports, which previously notified users of their names, phone numbers, or email addresses appearing online, often due to data breaches. Google indicated that the dark web reports did not effectively guide users in taking corrective actions, a function the new features are intended to fulfill.