Problematic on TikTok is a weekly column that unpacks the troubling trends that are emerging on the popular platform and runs on Tuesdays in the Daily Dot’s web_crawlr newsletter. If you want to get this column a day before we publish it, subscribe to web_crawlr, where you’ll get the daily scoop of internet culture delivered straight to your inbox.
One of TikTok’s latest innovations, the “suggested search,” is yet another way the application thinks outside the box and gives users new ways to interact with each other’s content. But, at times, it puts users’ morbid and prejudiced curiosity front and center.
Suggested search pops up when you tap the comments section on a video. In blue, bolded letters, phrases that other users have searched are hyperlinked to bring the viewer to the search results. Terms and phrases also pop up on the bottom of the screen on some videos, again allowing users to click and see search results.
Logical applications of the suggested search feature come up when there is controversy on the app. For example, many users are currently discussing Colleen Ballinger, a YouTuber who rose to fame playing the controversial character Miranda Sings and who is currently facing grooming allegations. On videos that discuss Ballinger, one might see “colleen ballinger controversy” or “miranda sings controversy” as a suggested search term. In this way, the feature is super helpful.
But on other types of videos, the suggested search function seems to almost ridicule creators. I’ve seen multiple videos of Dylan Mulvaney or Devin Halbal, both popular trans female creators, with the suggested searches “dylan mulvaney as a boy” or “devin halbal as a boy.”
These creators have nothing to do with the suggested searches that come up on their videos, and I’d bet that they would find those phrases hurtful and transphobic. The desire to see trans people in bodies they used to inhabit or genders they used to present as is othering and makes them seem more like circus acts than people. Users should be able to focus on Mulvaney’s and Halbal’s TikTok videos, which show them as they present themselves now, without transphobic search terms popping up.
Similarly, videos of Blue Ivy Carter, Beyoncé and Jay-Z’s eleven-year-old daughter, have been trending on TikTok because Carter dances with her mom onstage during the Renaissance Tour. I’ve seen multiple videos of Carter with the search term “blue ivy autistic.”
As I understand it, any assumptions that Carter has autism are based on how she dances—which is offensive toward both Carter and the autism community. She is eleven and brave enough to dance onstage in front of hundreds of thousands of people. If she is autistic, that’s information she should be able to share herself, not something TikTok should encourage users to speculate about via suggested searches.
Why it matters
TikTok allows users to report suggested searches, which feels like a double-edged sword. On the one hand, it’s positive that TikTok is monitoring which suggested searches pop up on the app, which could help avoid problems like the ones I’ve described above.
But on the other hand, unlike other content TikTok allows users to report (like videos or audios), suggested searches aren’t man-made. They are predictions generated by the app itself. Why create a feature that has the potential for harm as a result of algorithmic predictions?