Not even death can stop fake Facebook likes
Is Facebook rigged?
In a very detailed post, ReadWrite’s Bernard Meisler recounts how the social networking giant has been doing questionable things, such as inflating a page’s likes with possible bot accounts, liking pages on people’s behalf without their permission, and doing the latter even for people who have passed away.
It all started when Meisler noticed that the Facebook page for an arts magazine he publishes wasn’t reaching the same number of people as it had before. He chalked it up to the widely held belief that the social network was purposely throttling pages’ reach so that page owners would pay for the Sponsored Posts feature.
Meisler decided to bite the bullet and doled out an undisclosed amount of money to promote his page’s content. The strategy seemed to work. More people were liking his posts. A good thing, right?
Not when the likes were coming from people in non-English-speaking countries.
“My number of posts views did indeed go up, according [to] the statistics provided to me by Facebook,” Meisler wrote in his column, “but these likes didn’t seem like ‘quality’ views or likes.
“I was getting likes from folks in South America, the Middle East and Eastern Europe. Comments were posted not only in languages I couldn’t understand, but in alphabets I didn’t recognize. This was suboptimal—not to mention extremely weird—for a literary magazine written in English.”
Meisler’s discovery of the suspicious likes seems to confirm a BBC study that found that fake accounts—most likely bots—were artificially inflating the number of likes for various brand pages. These fraudulent Facebook profiles originated from countries like Egypt and the Philippines.
Facebook has consistently denied such claims. The company also contended that fake accounts constituted only a very small portion of all active accounts. The latter claim proved false when the company submitted a quarterly filing to the United States Securities and Exchange Commission in August 2012: Facebook revealed that roughly 8.7 percent of all accounts on the site were fake, a figure that translates to about 83 million accounts.
In September, Facebook stated that it was cracking down on fake likes, announcing that it had purged various likes from Pages that were “gained by malware, compromised accounts, deceived users, or purchased bulk Likes.” Apparently, Facebook didn’t do a very thorough job.
As if the possible fake likes weren’t enough, Meisler noticed yet another type of suspicious Facebook activity. His feed began showing his friends liking brands that they never would have liked in real life.
“Then I started noticing something else. ‘Sponsored Posts’ were popping up near the top of my newsfeed, and some of them made no sense,” he noted.
“A number of my liberal friends supposedly had ‘liked’ Mitt Romney, for instance. And my friend Nicolala, a high school English teacher from San Francisco, had ‘liked’ WalMart.”
Meisler reached out to his friend Nicolala to see if she had purposefully liked the retailer’s brand page and she responded with a resounding “no.” He got the same answer from other friends who had supposedly liked pages that were out of character, posting screenshots of their responses in his article.
That Facebook was liking brands for people without their consent was bad, but it wasn’t the worst thing that Meisler discovered. Two of the responses he got revealed that the social network had liked pages for people who were no longer alive.
“But at least those people were alive when they fake-liked something,” he writes. “Emma Kumakura, a game designer in Palo Alto, Calif., was outraged to see her friend Alice Mizer Stewart, who had passed away in March, shilling for a tea company.”
How did this happen? Facebook told Meisler that unless someone memorializes the account belonging to someone who has passed away, it’s business as usual for the social network, meaning that “someone’s ‘likes’ from months and months ago can still keep surfacing in the news feeds of their friends.”
As for the likes coming from people who are still alive, Facebook chalked them up to the possibility that those users clicked the Like button by mistake.
Another possibility, one Facebook didn’t mention to Meisler, is that the company scanned those people’s private messages and liked pages on their behalf. In October, the company revealed that it was engaging in this type of behavior.
Whatever the case may be, Meisler, and everyone else for that matter, has valid reason to be concerned. There’s no telling what will come of Facebook’s recently acquired ability to change its terms of service without being accountable to its userbase. But if the past is any indication, it will make for a less enjoyable user experience.
Photo via Johannes Fuchs/Flickr