This isn’t the first time Facebook has manipulated you—and it won’t be the last

Our fear isn’t truly that Facebook is using us as pawns. It’s that we may be dumb enough to let them.

Gillian Branstetter

To those upset about this weekend’s news, in which Facebook researchers altered users’ News Feeds to see whether they could nudge unwitting users’ moods in a negative or positive direction, I have some really, really bad news: This is happening to you all the time.

In the experiment, Facebook changed the News Feeds of nearly 700,000 of its users, altering how often several “negative” and “positive” words appeared in the posts they saw. It then analyzed the posts those users wrote to see whether seeing more depressing content led them to produce more negative posts, and repeated the analysis for users who saw more cheery, optimistic content in their News Feeds.

The result? Positive content on Facebook makes you want to post more positive content, but only slightly. “The actual impact on people in the experiment was the minimal amount to statistically detect it,” posted Adam Kramer, the paper’s lead author, in response to the ensuing uproar. “The result was that people produced an average of one fewer emotional word, per thousand words, over the following week.”

To be clear, this is what Facebook does every day it’s running. Facebook lives and dies by how well it can collect, organize, and analyze its users’ data, and by the product it builds from those findings. The algorithm that determines which posts appear at the top of your News Feed is a trade secret, and it may very well factor in minutiae as detailed as the number of positive or negative words in a post. Whatever goes into it, its goal is to manipulate you into doing one thing: using Facebook more.

If that’s scary to you, then the entire Internet should be a carnival of invasive horrors (and maybe it is). Nearly every major destination on the Internet collects data about you and alters what you see in order to get a specific behavioral response out of you. It’s not mind control or some Jedi mind trick: It’s called marketing.

The business models of Facebook, Google, Amazon, and any online service that tries to recommend anything to you, be it music, movies, or articles, rely on research much like what Facebook has been “caught” doing. Facebook didn’t censor your friends’ posts or create false ones; it merely altered the order and frequency with which you saw posts of a certain nature.

That business practice is what has made Facebook so enormously successful. It measures your activity, your friends’ activity, and your reaction to your friends’ activity. What’s the difference between finding out what you “like” and what makes you “happy?” Google does it with search results, Pandora does it with music suggestions, and Amazon does it with just about everything. This really shouldn’t be breaking news.

What’s actually more important than the research itself is the uproar over it. As headlines popped up all weekend about how Facebook “manipulated users’ emotions” for a “psychology experiment,” users on Twitter, Reddit, and Facebook itself seemed to think they were the unwitting victims of some digital MKUltra. Scan down the comments on Kramer’s explanation and pseudo-apology, and you’ll see comparisons to the Milgram experiments, the Stanford prison experiment, and even the vile Tuskegee syphilis study.

This type of reaction shows a deep misunderstanding of how social media is structured in the first place. It also blatantly ignores a century of public relations and advertising built on the idea that you can manipulate people’s emotions in order to get a desired reaction, such as buying life insurance, going to a movie, or writing a Facebook post.

What is true, however, is that as a research method this was unethical from a scientific standpoint. Subjects in any experiment must give informed consent, and the idea that a single line of text buried in a massive, unreadable user agreement meets that standard is nonsense. You can freely gather existing online data, say, from public Twitter feeds, without such consent, but actively changing a user’s environment requires a bit more under federal regulations on human experimentation.

But that does not mean what Facebook did is inherently wrong. Rather, if this experiment offends you, then Facebook itself should offend you. The truth is that this was an attempt to find out what makes people leave Facebook and how to keep them there.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” writes Kramer. “At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

Note that the primary concern here is not how you feel, but simply whether you’ll stay on Facebook to talk about it. In this way, it’s no different from a coffee house painting its walls in calming colors: if you would stay in a store full of clashing patterns, they’d do that, too. If you’d buy the old Coke after millions were spent developing New Coke, they’d drop the new version like a bad habit. And if more negative content made you stay on Facebook, they’d give it to you in droves.

That’s what the Facebook News Feed is: a curated list of things designed to make you stay on Facebook. It’s one of the reasons I think Facebook Director of Product Mike Hudack criticized clickbait: by encouraging you to go somewhere other than Facebook, clickbait is effectively the site’s main competition.

Indeed, it’s not just social media: websites like Buzzfeed, Vox, and, yes, the Daily Dot “manipulate” you into watching videos, reading listicles, and caring about things that you either wouldn’t care about otherwise or don’t actually matter (or both).

This supposed deception is the only thing Facebook is guilty of in attempting to change your mood. One could also point out that many companies, online and otherwise, actually alter the chemistry of your brain to create an addiction, be it to the sugars and fats in their food or to the reward system of a Flash game.

And if Facebook is toying with your emotions by slightly adjusting the content it provides, what of the news media? What of bias at Fox News and MSNBC? What of the lowbrow elitism of The Onion? What of this article you’re reading right now?

A common trope in advertising and politics is that most people say advertising doesn’t work on them, and yet sales figures show it does. Each of us likes to think of ourselves as a savvy consumer, above the sheeple who could be so easily led into the fire by slick Corporate America. So when it’s revealed that a company we use every day is blatantly toying with our emotions, rather than hiding it under the guise of “micromarketing” or “R&D,” we want to feel outraged because we feel we’ve been duped.

What we fail to realize is that even the honest reactions we have to products or websites have been carefully curated by experts on high. The entire business model of Silicon Valley is bent on learning as much about you as possible in order to more effectively “manipulate” you into feeling good when using its services. Our fear isn’t truly that Facebook is using us as pawns; it’s that we may be dumb enough to let them.

Photo via Wikimedia Commons/Pinocchio
