Could Wikipedia lose immunity under the Communications Decency Act?

The case for classifying the world’s largest free encyclopedia as a publisher, like Backpage.com before it

Claire Goforth

We’ve come to rely on Wikipedia to settle debates, answer questions, and generally scratch that curiosity itch. Free, easy to use, and available 24/7/365, the internet’s user-generated encyclopedia is convenient and, more often than not, correct. Since its humble beginnings as an unreliable work-in-progress, Wikipedia has amassed a loyal legion of fans, enabling it to expand its subject matter and improve its accuracy many times over.

This is due in large part to a dedicated contingent of volunteers who, along with bots, constantly comb new entries and edits to confirm their veracity and conformity to Wikipedia’s sensible, highly nuanced standards.
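
For a sense of what that bot-assisted patrolling involves, here is a minimal Python sketch of the kind of crude, rule-based check a patrolling tool might run against an incoming edit. Everything in it, from the word list to the thresholds, is hypothetical; real anti-vandalism bots such as ClueBot NG rely on trained machine-learning classifiers rather than a handful of heuristics.

    import re

    # Hypothetical, toy heuristics; real patrolling bots are far more
    # sophisticated than this sketch.
    BLACKLISTED_WORDS = {"liar", "idiot", "scum"}

    def looks_like_vandalism(old_text: str, new_text: str) -> bool:
        """Flag an edit for human review using crude heuristics."""
        old_words = set(re.findall(r"\w+", old_text.lower()))
        new_words = set(re.findall(r"\w+", new_text.lower()))

        # Heuristic 1: the edit introduces obviously abusive language.
        if (new_words - old_words) & BLACKLISTED_WORDS:
            return True

        # Heuristic 2: the edit blanks most of the page.
        if len(new_text) < 0.2 * len(old_text):
            return True

        return False

    print(looks_like_vandalism(
        "Andrew Gillum is an American politician.",
        "Andrew Gillum is a liar.",
    ))  # True: this edit would be flagged for review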

Within this massive endeavor to make Wikipedia as reliable as possible lies the problem: For a site to maintain its immunity under the Communications Decency Act, it must function as a platform, not a publisher. A finding that the world’s largest free encyclopedia is a publisher would puncture its immunity and open it to suit. Wikipedia’s undeniably good-faith effort to empower volunteers to constantly purge it of inaccurate, malicious, and defamatory posts and edits, as well as to create new pages and update existing ones, raises serious questions about which category it belongs to.

The fact that editors, even those who have risen to the level of administrators (also called admins or sysops, short for system operators), are volunteers may be irrelevant, because Wikipedia uses role-based access control to determine who can perform which tasks.

Wikipedia, in the end, controls the editors and can give or take away their access as it sees fit.
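
In software terms, that arrangement is a textbook example of role-based access control: permissions attach to roles, and the site alone decides which role an account holds. The Python sketch below is a simplified, hypothetical model of the idea, not MediaWiki’s actual permission system.

    # Simplified, hypothetical role-based access control (RBAC) model.
    # MediaWiki's real permission system is far richer than this.
    ROLE_PERMISSIONS = {
        "reader": {"read"},
        "editor": {"read", "edit"},
        "admin": {"read", "edit", "delete", "protect", "block"},
    }

    class Wiki:
        def __init__(self):
            self.roles = {}  # username -> role

        def grant(self, user: str, role: str):
            """The site, not the volunteer, assigns the role."""
            self.roles[user] = role

        def revoke(self, user: str):
            """...and the site can take access away just as easily."""
            self.roles[user] = "reader"

        def can(self, user: str, action: str) -> bool:
            role = self.roles.get(user, "reader")
            return action in ROLE_PERMISSIONS[role]

    wiki = Wiki()
    wiki.grant("alice", "admin")
    print(wiki.can("alice", "protect"))  # True
    wiki.revoke("alice")
    print(wiki.can("alice", "protect"))  # False: access revoked at will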

In the fall of 2018, a Google search of Andrew Gillum, then the Democratic candidate for governor of Florida, showed why that mattered.

At the time, the preview box on the results page, which pulled content from Wikipedia, incorrectly said that Gillum had died in Miami the day before. A closer look at the history of Gillum’s Wikipedia page over the following days revealed the tracks of dozens of edits, several deemed “vandalism” by Wikipedia editors and a few containing clearly false, potentially defamatory statements.

One edit said Gillum was “a liar who has done nothing in Tallahassee for the crime rate”; another, nearly a month before the election, said he had “lost to Ron DeSantis”; one of the more clever edits, by the tellingly named editor “gohomegillum,” replaced the embedded link to his official website with an anti-Gillum page. Eventually, an administrator gave the page temporary “protected” status based on “persistent vandalism.”

The edit history of Gillum’s page is hardly unique. The back-and-forth on pages is common enough to have earned the nickname “edit wars.” A rather amusing Wikipedia page catalogues the “lamest edit wars,” such as the debate over whether the Caesar salad was named for Julius Caesar or for the restaurateur Caesar Cardini. Wikipedia frowns on the practice, preferring that editors seek consensus or even dispute resolution, and warns that “[u]sers who engage in edit warring risk being blocked or even banned.” This is just another way Wikipedia controls edits on its site.
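
One concrete form of that control is Wikipedia’s three-revert rule, under which an editor who reverts a single page more than three times within 24 hours risks a block. Here is a minimal sketch of what enforcing such a rule might look like; the function name and data shapes are hypothetical, not Wikipedia’s actual tooling.

    from datetime import datetime, timedelta

    def violates_3rr(revert_times: list) -> bool:
        """True if any 24-hour window holds more than three reverts
        by the same editor on the same page (a hypothetical check
        modeled on Wikipedia's three-revert rule)."""
        times = sorted(revert_times)
        for i, start in enumerate(times):
            window = [t for t in times[i:] if t - start <= timedelta(hours=24)]
            if len(window) > 3:
                return True
        return False

    now = datetime(2019, 2, 26, 12, 0)
    reverts = [now - timedelta(hours=h) for h in (1, 5, 9, 20)]  # 4 reverts
    print(violates_3rr(reverts))  # True: grounds for a block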

Edits vary broadly, but even a casual observer can see patterns. The pages of high-profile or controversial individuals and organizations tend to be edited and re-edited far more frequently than those of little public interest, such as people who are deceased or have fallen out of the news; if someone or something is thrust into the spotlight, say by a major scandal, their page explodes with edits.

Political and controversial subjects are frequently targeted. Before actor Jussie Smollett reported a hate crime against him that police later accused him of fabricating, his page received perhaps a few dozen edits per month. From Jan. 30, the date the alleged hate crime was reported, until Feb. 26, his page was edited more than 800 times. The page of Rep. Alexandria Ocasio-Cortez (D-N.Y.), a darling on the left and a villain on the right whose name is constantly in the news, has been edited more than 1,000 times since she won her seat in November.

Each time an edit appears on Wikipedia, like the one referring to Ocasio-Cortez as “prone to verbal gaffes, leading some to speculate that she is not well-versed on the issues,” a bot, admin, or editor will review it. If the content of an edit passes muster, the language may nevertheless be cleaned up for better readability or clarity; if it doesn’t, the revision will be undone and, often, the editor will explain their reasoning for undoing it.

In the prior example, the editor who undid the change wrote, “You’ve seen people object and now you’re editing without consensus to add a dedicated criticism section on a [sic] active congressmember. Please stop your POV [point of view] editing efforts here and now.” All edits must conform to Wikipedia’s standards.
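
Mechanically, a revert like that one is itself just another revision: a wiki page’s history retains every version, and “undoing” an edit appends a new revision that restores the prior text, along with the reviewer’s explanation. The toy Python model below illustrates the idea; the class and method names are hypothetical, not MediaWiki’s.

    # Toy model of a wiki page's revision history; names are hypothetical.
    class Page:
        def __init__(self, title: str, text: str):
            self.title = title
            # Each revision: (editor, page text, edit summary).
            self.history = [("creator", text, "page created")]

        def edit(self, editor: str, new_text: str, summary: str):
            self.history.append((editor, new_text, summary))

        def undo(self, reviewer: str, reason: str):
            """Reverting appends a revision restoring the prior text."""
            prior_text = self.history[-2][1]
            self.history.append((reviewer, prior_text, "undo: " + reason))

        @property
        def text(self):
            return self.history[-1][1]

    page = Page("Alexandria Ocasio-Cortez", "AOC is a U.S. representative.")
    page.edit("anon", "AOC is prone to verbal gaffes.", "added criticism")
    page.undo("reviewer", "POV editing without consensus")
    print(page.text)  # back to the original sentence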

Often, problematic edits are reverted or deleted within minutes. But some linger. In 2005, the New York Times reported on the case of John Seigenthaler Sr., who was stunned to learn that his Wikipedia page had for months included the libelous statement that he “was thought to have been directly involved in the Kennedy assassinations of both John and his brother Bobby.”

Efforts to identify the anonymous editor who libeled him proved fruitless, and, with no other options for relief, Seigenthaler, now deceased, went public with his experience, penning an article in which he concluded “that Wikipedia is a flawed and irresponsible research tool.”

Today, most understand the law regarding defamatory statements on Wikipedia to be much the same as it was when Seigenthaler sought relief and found only roadblocks: that Wikipedia, like social media sites, is immune from suit for statements made by others on its pages, thanks to Section 230 of the Communications Decency Act.

But Wikipedia is not in quite the same position as the Twitters and Instagrams of the world, for it is not merely a passive platform functioning as a pass-through for others’ statements; Wikipedia, through its staff, its bots, and the volunteers it authorizes, plays an active role in creating and augmenting content.

That its motivation for doing so may be noble, and it does appear to be driven solely by the overarching goal of being as fair and accurate as possible, does not change the fact that a court could find Wikipedia has taken on such an active role that it has become what it has long resisted and denied being: a publisher. If so, the same reasoning that exposed Backpage.com to liability could apply to Wikipedia.

For many years, regulators and lawmakers struggled with how to address the illegal activity on Backpage.com, which was well known as a place where prostitution, including that of trafficked children, thrived. Again and again, victims’ suits against Backpage.com were dismissed on the grounds that the site was immune under the Communications Decency Act. Only after evidence piled up demonstrating that Backpage.com had played an active role in creating the sex ads were the site and its founders hauled before a grand jury and charged with facilitating prostitution, among other crimes, on the grounds that Backpage.com was a publisher.

How is this different from Wikipedia’s role creating and altering content on its site? Clearly, the free online encyclopedia is not involved in anything approaching the evil, insidious scourge of the child sex trade, but when the logic that exposed Backpage to liability is applied to Wikipedia, is it not also a publisher? If one of its admins makes a mistake that libels someone, why is Wikipedia immune, while a media outlet whose intern makes the same mistake could be subject to suit?

Many will say that Wikipedia, which is operated by the nonprofit Wikimedia Foundation, is a force for good in this world, and they’re not wrong. It is also the site credited with, however unintentionally, ending the 244-year print run of the beloved Encyclopaedia Britannica in 2012. And it is part of the constellation of sites that has helped put the newspaper industry on life support.

And it isn’t necessarily hurting for cash, either. According to Charity Navigator, which gives the Wikimedia Foundation an extremely high rating, the nonprofit had nearly $90 million in revenue in FY2017 and holds $113 million in assets. To put those sums in perspective, the newspaper conglomerate Digital First Media, which operates 97 newspapers, including the Denver Post, posted nearly $160 million in profits that year. Wikipedia is just one site.

Even as more and more of us rely on crowd-sourced information like Wikipedia’s, the current understanding of the Communications Decency Act means that holding someone accountable for harms caused online can prove elusive, if not impossible.

Meanwhile, internet companies rake in profits and donations with such impunity that Georgetown University law professor Rebecca Tushnet, in a frequently cited paper, described them as having “power without responsibility.”

Perhaps the time has come to take some of that power back.

In recent years, a growing chorus of voices has asked whether allowing unfettered expression on the internet, with no recourse for victims who are defamed, have their privacy invaded, or worse, truly serves the public interest, and whether it is in our best interest to update Section 230 of the Communications Decency Act, or for courts to reinterpret it, so that companies can be held accountable for at least some of the conduct on their sites.

The websites argue that exposing them to any liability for statements on their sites would have a stifling effect on the internet, a massive industry that contributes substantially to the economy.

Of course, this is their argument; these companies don’t want to be sued. But any update to the law could be tailored to allow for liability only in very narrow circumstances, ensuring that suits against sites remain possible only in the most exceptional cases.

When a website, such as Wikipedia or Backpage.com, takes an active role in creating, editing or otherwise augmenting content, should it not be liable for its own mistakes and transgressions, as well as those of admins and editors who derive their power from the site itself?

 