Facebook’s autocomplete feature suggested child abuse videos

privateidentity/Flickr (CC-BY)

The social network is investigating.

Facebook issued an apology after its autocomplete feature displayed obscene search suggestions to dozens of users.

Multiple users reported seeing sexually explicit and violent search results on Thursday night after typing “video of” into Facebook. Results included “video of girl sucking dick underwater,” “videos of school shooting” and “videos of child abuse.” The issue was reportedly fixed within a few hours.

https://twitter.com/mollylolzzz/status/974476838159331328

Facebook said it was investigating the incident but didn’t provide additional information. Unlike most search engines, which base suggestions on previously searched words and trending topics, Facebook’s autocomplete feature is designed to suggest the most popular searches using predictions that direct people to profiles and pages.

The search feature still appears to be acting up. In our own test, autocomplete suggested bizarre topics like “zodwa wabantu videos and pics” and “cristiano ronaldo hala madrid king video call.” Other users are reporting similarly concerning results that relate neither to Facebook nor to their search history.

Facebook apologized in a statement to Mashable for the behavior of its rogue tool and emphasized that it doesn’t allow sexually explicit content.

“We’re very sorry this happened. As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform. We do not allow sexually explicit imagery, and we are committed to keeping such content off of our site. We are looking into why these search predictions appeared, and going forward, we’re working to improve the quality of search predictions.”

This comes just days after the social network was criticized for posting a survey that asked users whether pedophilia was OK. Participants could respond that the behavior should be allowed, that it shouldn’t be allowed, or that they had no preference. Facebook said the survey was a mistake.

The Daily Dot has reached out to the company to learn more about the state of its autocomplete tool, and we’ll update this article if we hear back.

Phillip Tracy

Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.