The social network is investigating.
Multiple users reported seeing sexually explicit and violent search results on Thursday night after typing “video of” into Facebook. Results included “video of girl sucking dick underwater,” “videos of school shooting” and “videos of child abuse.” The issue was reportedly fixed within a few hours.
Facebook said it was investigating the incident but didn't provide additional information. The autocomplete feature is designed to suggest the most popular searches, using predictions that direct people to profiles and pages. This differs from most other search engines, which rely on a user's previously searched terms and trending topics.
The search feature still appears to be acting up. In our own test, autocomplete suggested bizarre topics like "zodwa wabantu videos and pics" and "cristiano ronaldo hala madrid king video call." Other users are reporting similarly concerning results that relate to neither Facebook nor their search history.
Facebook apologized in a statement to Mashable for the behavior of its rogue tool and emphasized that it doesn’t allow sexually explicit content.
“We’re very sorry this happened. As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform. We do not allow sexually explicit imagery, and we are committed to keeping such content off of our site. We are looking into why these search predictions appeared, and going forward, we’re working to improve the quality of search predictions.”
This comes just days after the social network was criticized for posting a survey that asked users whether pedophilia was OK. Participants could respond that the behavior should be allowed, that it shouldn't be allowed, or that they had no preference. Facebook said the survey was a mistake.
The Daily Dot has reached out to the company to learn more about the state of its autocomplete tool, and we'll update this article if we hear back.
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.