Testing the limits of Chinese Internet censorship
Just how censored is the Internet in China, anyway? The title of a new study out of Harvard University yields a possibly surprising answer: “…Censorship in China Allows Government Criticism but Silences Collective Expression.” In other words, you can complain about whatever you think is wrong; you just can’t suggest that you and your fellow Chinese try doing anything to fix it.
Harvard researchers Gary King, Jennifer Pan, and Margaret E. Roberts co-authored the study, which they call “the first large-scale, multiple-source analysis of the outcome of what may be the most extensive effort to selectively censor human expression ever implemented.”
To do this, the researchers devised a system that automatically downloaded just-posted content from various Chinese social media before censors could remove whatever they found objectionable, then compared the content of censored and non-censored posts. The results?
Contrary to previous understandings, posts with negative, even vitriolic, criticism of the state, its leaders, and its policies are not more likely to be censored. Instead, we show that the censorship program is aimed at curtailing collective action by silencing comments that represent, reinforce, or spur social mobilization, regardless of content. Censorship is oriented toward attempting to forestall collective activities that are occurring now or may occur in the future—and, as such, seem to clearly expose government intent.
The study explains that Chinese Internet censorship comes in three forms. The first is the so-called Great Firewall of China, which prevents users inside the country from accessing various foreign websites, including Facebook and Twitter. The second is keyword blocking, which makes it impossible to post anything containing certain banned terms. In practice, keyword blocking does little to restrict speech, “since netizens do not find it difficult to outwit automated programs [using] analogies, metaphors, satire, and other evasions.”
The third type of censorship, applied to posts that make it past the first two barriers, involves actual human censors reading content and removing whatever they deem objectionable. The Harvard study, naturally, focuses on the results of this third type: download raw content posted onto a Chinese site, then revisit the site to see if the content is removed, and how quickly.
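The revisit-and-check procedure can be illustrated with a minimal sketch. This is not the authors' actual pipeline; the marker strings come from the removal notices quoted in the study, while the function names, the sample post, and the timestamps are all hypothetical stand-ins for real downloads.

```python
# Sketch of the study's revisit-and-check idea: save a post shortly after it
# appears, re-fetch it later, and classify the outcome. Assumes HTML has
# already been downloaded; no network code is shown.

from datetime import datetime, timedelta

# Removal notices the researchers reported seeing in place of censored posts.
CENSORSHIP_MARKERS = [
    "Sorry, the host you were looking for does not exist",
    "has been deleted, or is being investigated",
]

def classify_revisit(original_html, revisit_html):
    """Compare a first-pass download with a later revisit.

    Returns 'censored' if the revisit shows a removal notice,
    'present' if the post is still there, and 'unknown' otherwise
    (e.g. a transient outage rather than intentional removal).
    """
    if any(marker in revisit_html for marker in CENSORSHIP_MARKERS):
        return "censored"
    if original_html in revisit_html:
        return "present"
    return "unknown"

def removal_lag_hours(posted_at, censored_at):
    """Hours between the original posting and the observed removal."""
    return (censored_at - posted_at) / timedelta(hours=1)

# Hypothetical example: a post captured at posting time, revisited later.
post = "<p>Let's all meet at the square tomorrow.</p>"
revisit = ("Sorry, the host you were looking for does not exist, "
           "has been deleted, or is being investigated")
print(classify_revisit(post, revisit))  # -> censored
print(removal_lag_hours(datetime(2013, 1, 1, 9, 0),
                        datetime(2013, 1, 1, 21, 0)))  # -> 12.0
```

Separating detection (the marker check) from timing (the lag calculation) mirrors the two questions the study asks of each post: was it removed, and how quickly.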
The researchers noted that “The censors are not shy, and so we found it straightforward to distinguish (intentional) censorship from sporadic outages or transient time-out errors. The censored websites include notes such as 'Sorry, the host you were looking for does not exist, has been deleted, or is being investigated' and are sometimes even adorned with pictures of Jingjing and Chacha, Internet police cartoon characters.”
It turns out the Chinese censors work very quickly: most censored material was removed within 24 hours of its original posting, though posts occasionally stayed up as long as five days before coming to a censor’s attention.
Of course, as the researchers themselves admit, there are limits to what can be gleaned from their study’s methods. They have no way of knowing how much content was censored before they could download it, or how much Chinese bloggers self-censor. Perhaps more importantly, “We have also not studied the effect of physical violence, such as the arrest of bloggers, or threats of the same.”
Approximately 13 percent of all collected Chinese social media posts were censored. While discussing the coding methods used in the study, the researchers noted that “conversation in social media in almost all topic areas (and countries) is well known to be highly ‘bursty,’ that is, with periods of stability punctuated by occasional sharp spikes in volume around specific subjects. We also found that with only two exceptions—pornography and criticisms of the censors, described below—censorship effort is often especially intense within volume bursts.”
In other words, when lots of Chinese suddenly start posting about a particular topic, the censors pay close attention. Most of their focus is on political topics, but any sort of call for collective action is censored, even when it’s entirely non-political.
Notably, one of the highest “collective action potential” events was not political at all. The study recounts that following the Japanese earthquake and the subsequent meltdown at the Fukushima nuclear plant, a rumor spread through Zhejiang province that the iodine in salt would protect people from radiation exposure, and a mad rush to buy salt ensued. The rumor was biologically false and had nothing to do with the state one way or the other, but it was heavily censored; the reason appears to be the localized control of collective expression by actors other than the government. As the researchers put it, “we find that salt rumors on local Web sites are much more likely to be censored than salt rumors on national Web sites.”
The researchers listed other examples of censorship before reiterating that “collective organization is not tolerated by the censors, regardless of whether it supports the government or criticizes it.”
In conclusion, individual Chinese citizens apparently do enjoy a great deal of freedom to criticize their government, leaders, and laws, to the point where “government policies sometimes look as bad, and leaders can be as embarrassed, as is often the case with elected politicians in democratic countries, but, as they seem to recognize, looking bad does not threaten their hold on power so long as they manage to eliminate discussions associated with events that have collective action potential.” Indeed, by allowing criticism, and thus getting a feel for what the people are thinking, while simultaneously stifling all collective action (even something as innocuous as a fad for buying extra salt), the Chinese government may actually be stronger and more stable than if it tried to stifle criticism as well. This has worrying implications: the researchers said that China “is probably being watched closely by autocrats from around the world.”
Art by Jason Reed for the Daily Dot