Multiple users reported seeing sexually explicit and violent search results on Thursday night after typing “video of” into Facebook. Results included “video of girl sucking dick underwater,” “videos of school shooting” and “videos of child abuse.” The issue was reportedly fixed within a few hours.
Facebook said it was investigating the incident but didn’t provide additional information. The autocomplete feature is designed to suggest the most popular search results using predictions that direct people to profiles and pages. This differs from most other search engines that use previously searched words and trending topics.
The search feature still appears to be acting up. In our own test, autocomplete suggested bizarre topics like “zodwa wabantu videos and pics” and “cristiano ronaldo hala madrid king video call.” Other users are reporting similarly concerning results that relate to neither Facebook nor their search history.
Facebook apologized in a statement to Mashable for the behavior of its rogue tool and emphasized that it doesn’t allow sexually explicit content.
“We’re very sorry this happened. As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform. We do not allow sexually explicit imagery, and we are committed to keeping such content off of our site. We are looking into why these search predictions appeared, and going forward, we’re working to improve the quality of search predictions.”
This comes just days after the social network was criticized for posting a survey that asked users if pedophilia was OK. Participants could respond by saying the behavior should be allowed, shouldn’t be allowed, or no preference. Facebook said the survey was a mistake.
The Daily Dot has reached out to the company to learn more about the state of its autocomplete tool, and we’ll update this article if we hear back.
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.