We tested out which search engines pointed suicidal users to prevention resources. The results might surprise you.
The New York Times reports that Germanwings copilot Andreas Lubitz researched cockpit security and suicide methods shortly before the March 24, 2015, crash of Germanwings Flight 9525, an apparent deliberate act in which Lubitz killed everyone aboard, passengers and crew alike. That conclusion rests on the most recent available information; the full flight recorder data still needs to be recovered and reviewed. While multiple failures appear to have contributed to the Germanwings tragedy, one possible roadblock could have been put in the way of his desperate search for answers to his emotional struggles: his search results.
Police have not revealed the details of specific searches or the search engines used, though it seems highly likely that the copilot was using Google, given that it occupies a dominant share of the search engine market. The other major English-language search engines are Bing, Yahoo, and AOL—all of which, along with Google, provide special tags and informational notices at the top of certain sensitive searches.
Terms like “suicide,” “I want to kill myself,” and “suicide methods” are obvious flags for outreach: search engines could surface links to suicide resources, like the National Suicide Prevention Lifeline in the United States or TelefonSeelsorge in Germany, at the very top of the results page. Beyond that fast link, they could also adjust which results rank highest for such terms so that pages offering help appear first.
For people like Lubitz, this could be particularly important. Desperate people turning to the Internet for help might find hotlines beneficial, and in his case, according to reports, his search history also indicated that he researched medical treatments and other therapies for managing depression. Had a suicide hotline immediately popped up, he might have taken advantage of the resource to call for help and get advice from a trained crisis counselor.
This became an issue for Apple in 2013, when the company finally released updates to Siri after pressure from members of the public and crisis counselors. In 2012, advocate Summer Beretsky had to spend nearly 10 minutes pleading with Siri before Apple’s digital assistant grudgingly pulled up some suicide resources, and the resulting media firestorm pushed Apple to make critical changes to Siri’s programming.
Apple learned from its mistakes, and other tech companies should have done the same. Without knowing which search engine Lubitz used, it’s difficult to tell what he might have seen, but the differences between major search engines are stark. Troublingly, many offer little comfort to people with suicidal ideation who are looking for resources.
The following searches were conducted on April 4, 2015, using Safari’s “private browsing” function with the goal of obtaining reasonably unbiased results. Depending on the user and browsing history, results may vary, but links to resources are likely to remain consistent across users, browsers, and search engines; if Yahoo, for example, presents a “blank slate” user with suicide resources, it’s likely to do the same for someone with an actual browsing history.
Shockingly, Google provided the least support of any engine tested, with no prominent link to a suicide prevention hotline for “suicide,” “I want to kill myself,” or “suicide methods.” Some searches provided links to news followed by straight search results, while others offered simply search results. While many of these results related to suicide prevention and outreach, none of them were presented as a bold alert at the top of the page for a desperate user. Some sites within the first page actively promoted self-harming behaviors.
Bing, with the next highest percentage of market share, turned out to offer the most useful resources. For all three searches, it provided the phone number for the National Suicide Prevention Lifeline along with a link to connect users to immediate resources. It did not, however, provide a TTY number for d/Deaf and hard of hearing people (800-799-4889) or a link to a direct chat service.
On Yahoo, all three searches yielded the National Suicide Prevention Lifeline’s phone number at the top of the search results, though the search engine did not link directly to the organization or its services. AOL provided similar results for “I want to kill myself” and “suicide” but offered only straight results for “suicide methods.”
While high-ranked search results for “suicide” and “I want to kill myself” with all four major search engines tended to lean towards resources for people in crisis, the same was not true of “suicide methods.” In that case, the search engines provided a mix of links, including some providing people with detailed information on how to kill themselves with the greatest degree of efficiency and efficacy. While such sites may be highly ranked in terms of internal algorithms, they’re dangerous results for desperate people to stumble upon, illustrating that for certain kinds of searches, companies can and should game their own searches.
One such example came up in 2004, when Google was alerted to the fact that searching for “Jew” turned up some anti-Semitic search results. The company added a link in its “sponsored” column pointing users to an explanation for the results, noting that it remained committed to the integrity of its results despite their unpleasant contents (the explanation has since been removed from Google’s servers). The controversy revealed that search firms are aware of potentially controversial searches and have the power to flag and address certain keywords, and searches related to suicide are clear targets for such moves.
Instead, many major search engines provide helpful autocompletes like “suicide methods” (Google and AOL, if you start to type “sui—”) and “suicide notes” (Yahoo). Bing sticks to autocompletes like “suicide prevention,” though all four helpfully direct users to Suicide Silence and Suicide Squad, just in case their searches are pop culture related.
Search is getting smarter, with algorithms going beyond simple ranking and into complex relationships between search engines, users, and individual websites. Addressing some of the most common keywords related to suicide with a simple link to resources at the top would be a simple move that could save lives, as Bing, Yahoo, and AOL apparently already believe. Google, meanwhile, doesn’t provide basic outreach to users.
Offering suicide resources at the top of search results won’t compromise those results; on the engines that already do it, the suicide prevention hotline is clearly a separate informational box at the top of the page, with results progressing below. It can certainly be added without confusing users, and when it comes to an issue as serious as severe depression and suicidal thoughts, a small step goes a long way.
Adding a link to a suicide prevention hotline likely wouldn’t have prevented the Germanwings crash, and there’s no way to tell what might have changed Lubitz’s mind. Providing that resource, however, might have helped him consider better options—and it could certainly help people in the future.
Photo via Time to Change/Newscast Online
s.e. smith is a Northern California-based journalist and writer focusing on social justice issues. smith's work has appeared in publications like Esquire, the Guardian, Rolling Stone, In These Times, Bitch Magazine, and Pacific Standard.