How do search engines respond when you Google ‘suicide’?
We tested out which search engines pointed suicidal users to prevention resources. The results might surprise you.
The New York Times reports that Germanwings copilot Andreas Lubitz researched cockpit security and suicide methods shortly before his likely death by suicide in the March 24, 2015, crash of Germanwings Flight 9525. Lubitz took the entire passenger and crew manifest down with him in a crash that appears to have been deliberate, according to the most recent available information, though the full flight recorder data has yet to be recovered and reviewed. While multiple failures appear to have contributed to the Germanwings tragedy, one possible roadblock could have been placed in the path of his desperate search for answers to his emotional struggles: his search results.
Police have not revealed the details of specific searches or the search engines used, though it seems highly likely that the copilot was using Google, given its dominant share of the search engine market. The other major English-language search engines are Bing, Yahoo, and AOL—all of which, along with Google, provide special tags and informational notices at the top of certain sensitive searches.
Terms like “suicide,” “I want to kill myself,” and “suicide methods” would appear to be obvious flags for outreach at the top of a page of search results, as search engines could provide links to suicide resources like the National Suicide Prevention Lifeline in the United States or TelefonSeelsorge in Germany. In addition to providing a fast link to resources, search engines could also adjust the top results for terms like these to surface links to help.
For people like Lubitz, this could be particularly important. Desperate people seeking help and turning to the Internet might find hotlines beneficial, but in his case, according to reports, his search history also indicated that he researched medical treatments and other therapies available to manage depression. Had a suicide hotline immediately popped up, he might have taken advantage of the resource to call for help and get advice from a trained crisis counselor.
This became an issue for Apple in 2013, when the company finally released some updates to Siri after pressure from members of the public and crisis counselors. Advocate Summer Beretsky spent nearly 10 minutes in 2012 pleading with Siri for help before Apple’s digital assistant grudgingly pulled up some suicide resources, creating the media firestorm Apple needed to make critical changes to Siri’s programming.
Apple learned from its mistakes, and other tech companies should have done the same. Without knowing which search engine he used, it’s difficult to tell what Lubitz might have seen, but the differences between major search engines are stark. Troublingly, many offer little comfort to people with suicidal ideation who are looking for resources.
The following searches were conducted on April 4, 2015, using Safari’s “private browsing” function with the goal of obtaining reasonably unbiased results. Depending on the user and browsing history, results may vary, but links to resources are likely to remain consistent across users, browsers, and search engines; if Yahoo, for example, presents a “blank slate” user with suicide resources, it’s likely to do the same for someone with an actual browsing history.
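The test described above was performed manually, but the same check could be sketched programmatically: does a crisis resource appear prominently near the top of a results page? The following Python sketch is purely illustrative—the `has_crisis_box` helper, the marker strings, and the sample HTML are assumptions for demonstration, not the method actually used for this article.

```python
# Hypothetical sketch: flag whether a search results page surfaces
# crisis resources near the top, mirroring the manual test described
# in this article. Marker strings and sample pages are illustrative.

CRISIS_MARKERS = [
    "national suicide prevention lifeline",
    "1-800-273-8255",  # the Lifeline's number at the time of writing
]

def has_crisis_box(page_html: str, top_chars: int = 2000) -> bool:
    """Return True if a crisis-resource marker appears within the first
    top_chars characters of the page—a rough proxy for a prominent
    informational box above the ordinary results."""
    head = page_html[:top_chars].lower()
    return any(marker in head for marker in CRISIS_MARKERS)

# Illustrative pages: one with a hotline box up top, one without.
with_box = (
    "<div class='alert'>National Suicide Prevention Lifeline: "
    "1-800-273-8255</div>" + "<li>result</li>" * 50
)
without_box = "<li>ordinary search result</li>" * 50

print(has_crisis_box(with_box))     # True
print(has_crisis_box(without_box))  # False
```

A real version would fetch live results pages per engine and per query, but even this rough string check captures the distinction the article draws: a prominent resource box versus a page of plain results.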
Shockingly, Google provided the most minimal options for users, with no prominent link to a suicide prevention hotline for “suicide,” “I want to kill myself,” and “suicide methods.” Some searches provided links to news followed by straight search results, while others offered simply search results. While many of these results related to suicide prevention and outreach, none of them were presented as a bold alert at the top of the page for a desperate user. Some sites within the first page actively promoted self-harming behaviors.
Bing, with the next highest percentage of market share, turned out to offer the most useful resources. For all three searches, it provided the phone number for the National Suicide Prevention Lifeline along with a link to connect users to immediate resources. It did not, however, provide a TTY number for d/Deaf and hard of hearing people (800-799-4889) or a link to a direct chat service.
On Yahoo, all three searches yielded the National Suicide Prevention Lifeline’s phone number at the top of the search results, though the search engine does not link directly to the organization or its services. AOL provides similar results for “I want to kill myself” and “suicide” but offers only straight results for “suicide methods.”
While high-ranked search results for “suicide” and “I want to kill myself” with all four major search engines tended to lean towards resources for people in crisis, the same was not true of “suicide methods.” In that case, the search engines provided a mix of links, including some providing people with detailed information on how to kill themselves with the greatest degree of efficiency and efficacy. While such sites may be highly ranked in terms of internal algorithms, they’re dangerous results for desperate people to stumble upon, illustrating that for certain kinds of searches, companies can and should game their own searches.
One such example came up in 2004, when Google was alerted to the fact that searching for “Jew” turned up some anti-Semitic search results. The company added a link in its “sponsored” column to point users to an explanation for the results, noting that it remained committed to the integrity of its results despite their unpleasant contents (the explanation has since been removed from Google’s servers). The controversy revealed that search firms are aware of potentially controversial searches and have the power to flag and address certain keywords, and searches related to suicide are clear targets for such moves.
Instead, many major search engines provide helpful autocompletes like “suicide methods” (Google and AOL, if you start to type “sui—”) and “suicide notes” (Yahoo). Bing sticks to autocompletes like “suicide prevention,” though all four helpfully direct users to Suicide Silence and Suicide Squad, just in case their searches are pop culture related.
Search is getting smarter, with algorithms going beyond simple ranking and into complex relationships between search engines, users, and individual websites. Addressing some of the most common keywords related to suicide with a simple link to resources at the top would be a simple move that could save lives, as Bing, Yahoo, and AOL apparently already believe. Google, meanwhile, doesn’t provide basic outreach to users.
Offering suicide resources at the top of search results won’t compromise them; in the case of other search engines, it’s very clear that the suicide prevention hotline is a separate informational box at the top of the search, with results progressing below. It can certainly be added without confusing users, and when it comes to an issue as serious as severe depression and suicidal thoughts, a small step goes a long way.
Adding a link to a suicide prevention hotline likely wouldn’t have prevented the Germanwings crash, and there’s no way to tell what might have changed Lubitz’ mind. Providing that resource, however, might have helped him consider better options—and it could certainly help people in the future.
Photo via Time to Change/Newscast Online
s.e. smith is a Northern California-based journalist and writer focusing on social justice issues. smith's work has appeared in publications like Esquire, the Guardian, Rolling Stone, In These Times, Bitch Magazine, and Pacific Standard.