Back in October, a Fox News article on apocalyptic belief systems by Dr. Robert Jeffress observed that “47 percent of American Christians believe that the end of the world as described in the Bible will occur within the next 40 years.” Around the same time—but, notably, on the opposite side of the ideological spectrum—Roger Cohen of the New York Times proclaimed that “many people I talk to, and not only over dinner, have never previously felt so uneasy about the state of the world,” before concluding that “the search is on for someone to dispel the foreboding and embody, again, the hope of the world.”
A Gallup poll last week seemed to tie these sentiments together when it discovered that the number of Americans satisfied with the direction in which their country was headed had stagnated at 23 percent.
While the Eeyore-esque doomsday prophecies might be justifiable if we lived during the Black Death or the Great Depression, Steven Pinker and Andrew Mack of Slate recently argued that the problems we assume are steadily worsening have, in many ways, actually gotten better. To summarize: homicide throughout the world is on the decline, increased awareness of women’s rights is gradually reducing violence against women, democracy is spreading, genocide and the mass murder of civilians are far less common, and even the Middle East offers sober observers valid cause for cautious optimism.
What’s more, Americans are increasingly optimistic that, for them personally, 2015 will be better than 2014, with an Associated Press/Times Square Alliance poll indicating that 48 percent think the upcoming year will be an improvement and 11 percent believe it will be worse.
While 2014 was a gloomy year in American culture—with Ebola and Michael Brown dominating headlines—Pinker and Mack’s findings suggest that Americans need to believe the sky is falling even when the weather is sunny and clear. It is a tendency that, though rooted in the journalistic culture of our mainstream media, is exacerbated by the socio-psychological conditions created by the Internet. In other words, there is a considerable disconnect between how we feel about our own lives and how we feel compelled to perceive the world around us; to quote Don DeLillo in Mao II, there’s “an unremitting mood of catastrophe… We don’t even need catastrophes, necessarily. We only need the reports and predictions and warnings.”
Nowhere is this seeming paradox more vividly illustrated than in social media. On the one hand, as the New York Times reported last year, an analysis from social psychologist Jonah Berger at the University of Pennsylvania revealed that when social media users chose to pass along content, “they preferred good news to bad.” According to Berger, “the more positive an article, the more likely it was to be shared.” As other neurological and psychological researchers eventually discovered, this is because social media users are conscientious not only about being interesting, but about reinforcing a positive impression of themselves to others. Indeed, as the article noted, “this social consciousness comes into play when people are sharing information about their favorite subject of all: themselves,” with 80 percent of all Twitter users sharing content about themselves—most of which, of course, was positive.
However, even though social media users share positive news more often than negative news, the material that seems to have the greatest impact on the consumer’s worldview is that which skews pessimistic. It all has to do with how the media’s famous mantra that “if it bleeds, it leads” translates into the real world—or as Jeb Lund of the Guardian put it in his op-ed “2014 was a terrible, horrible, no good, very bad year. You probably don’t even remember why”: “Some critics credit viral social media news with intensifying this phenomenon… but that’s letting the media off too easy.”
“Fear-based news programming has two aims,” explained Dr. Deborah Serani in an article for Psychology Today. “The first is to grab the viewer’s attention. In the news media, this is called the teaser. The second aim is to persuade the viewer that the solution for reducing the identified fear will be in the news story.” While this may seem cruelly manipulative, research suggests that the media does this because audiences instinctively gravitate toward this material. As the BBC reported earlier this year, an experiment run at McGill University that tricked subjects into reading the political stories toward which they were naturally most inclined (by telling them that only their eye movements, and not the content of what they read, would be monitored) found that “participants often chose stories with a negative tone—corruption, setbacks, hypocrisy and so on—rather than neutral or positive stories.”
The reason for this is negativity bias, which is motivated not merely by the enjoyment of others’ misfortune for its own sake, but by evolution. According to Dr. Serani, “we’ve evolved to react quickly to potential threats,” and “bad news could be a signal that we need to change what we’re doing to avoid danger.”
When a handful of the stories declaring 2014 to be the “worst” year are deconstructed, this evolutionary instinct becomes especially evident. For example, in Dean Obeidallah’s piece for the Daily Beast on how the summer of 2014 was the “worst ever,” he specifically mentions the shootings of Michael Brown and Eric Garner, the Arab-Israeli conflict, the spread of ISIS in Iraq and Syria, and the Russian invasion of Ukraine—all stories that speak to existing threats either faced by Americans at home or potential ones in which the nation could be mired abroad.
By simply adding “Ebola” or “cyberhacking” to that list, one also gets the bulk of the stories that Jeffress and Cohen each considered when issuing their gloomy assessments of the year. When the media focuses on stories like these, it feeds the public’s desire to feel informed about potential dangers. Because social media already caters to the instinct to feel one’s life is inadequate compared to the lives of those around us (which is also a very real, albeit somewhat different, type of threat), these stories inevitably add fuel to the fire.
While one hopes Jeffress is wrong about Americans becoming indifferent to these problems because they take them for granted, there are real-world political consequences to this attitude. After all, despite governing during a strong economy and with a solid record of achievement behind him, President Obama’s Democratic Party suffered significant losses in the 2014 midterm elections due less to a low approval rating than to low voter turnout. While a strong job performance may not have been enough to inspire Obama’s supporters to keep his party in power, the animosity felt by his opponents was enough to bring them to the polls.
Beyond the immediate realm of electoral politics, however, this issue has real-world ramifications in another important way. Although it is dangerous to be ill-informed about the real dangers that exist in our world, it is also unhealthy to have a distorted point of view that skews toward the melancholy or the fearful. As Dr. Serani explained, being bombarded with negative news stories online has joined watching TV news and reading the newspaper as a “psychologically risky pursuit, which could undermine your mental and physical health.” This isn’t to say that Americans should close their eyes to bad news, but at the same time, exaggeration does us no favors. Being prepared for the worst and appreciating how things have gotten better aren’t mutually exclusive attitudes.