Photo by jacinta lluch valero/Flickr (CC BY-SA 2.0)

Google accidentally leaked its ‘right to be forgotten’ data

Less than 5 percent of requests relate to public figures.

Dell Cameron

Tech

Posted on Jul 14, 2015   Updated on May 28, 2021, 9:01 am CDT

An apparent bug in the source code of Google’s transparency reports has revealed new details about the kinds of people who ask the company to remove their personal information from search results.

The public fuss over the so-called “right to be forgotten” appears to misrepresent how the system is actually being used. In reviewing the data, journalists at the Guardian found that an overwhelming majority of requests came from ordinary people worried about their private and personal information being online—not celebrities seeking to scrub embarrassing incidents from the public record.

The accidentally revealed data come from Europe, where Google is required to honor reasonable right-to-be-forgotten takedown requests due to a 2014 court ruling. A prominent U.S. consumer group wants Google to do the same thing in its home country.

Media coverage of the right-to-be-forgotten debate often focuses on the possibility that public figures could convince Google to delete evidence of their newsworthy misconduct. In fact, only a small percentage of the requests came from celebrities and other public figures.

“Less than 5% of nearly 220,000 individual requests made to Google to selectively remove links to online information concern criminals, politicians and high-profile public figures,” the Guardian reported, “with more than 95% of requests coming from everyday members of the public.”

The reporters reviewed 218,320 right-to-be-forgotten requests that Google received in the 10 months beginning in May 2014. Google granted 46 percent of those requests, or 101,461 in total, and 99,569 of the granted requests concerned “personal or private information.”

Google honored about half of the delisting requests and rejected roughly a third; the rest were still pending at the time of analysis.

“We’ve always aimed to be as transparent as possible about our right to be forgotten decisions,” Google told the Guardian on Tuesday. “The data the Guardian found in our Transparency Report’s source code does of course come from Google, but it was part of a test to figure out how we could best categorise requests. We discontinued that test in March because the data was not reliable enough for publication. We are however currently working on ways to improve our transparency reporting.”

