The case was brought by Dylan Voller, an Aboriginal man who spent time in juvenile detention. He is suing media companies over reader comments posted on their Facebook pages after an ABC investigative report exposed his mistreatment behind bars.
The comments at issue were posted by readers on the Facebook pages of The Sydney Morning Herald, The Australian, The Centralian Advocate, Sky News, and The Bolt Report, according to ABC News.
Supreme Court of New South Wales Justice Stephen Rothman ruled on Monday that the publishers are responsible for comments made by outside parties.
In the civil case Voller brought against Fairfax Media, Nationwide News, and Sky News, he argued the companies should have anticipated a “significant risk of defamatory observations.”
He said many of the comments, such as those calling him a “rapist” and alleging that he “savagely bashed” and injured a Salvation Army officer, were false and amounted to defamation, according to BuzzFeed News.
The ruling comes a few months after the media companies argued during a hearing that they were not liable. Social media managers were questioned at the hearing about their roles and responsibilities, BuzzFeed News reported in February.
Social media consultant Ryan Shelley’s testimony was perhaps the most instrumental. He argued that publishers cannot turn off the comments section on a Facebook page but can “hack” the system to keep offensive posts contained.
The “hack” would have moderators filter out comments containing the most common words, such as “a” or “the,” ensuring that almost any full sentence would be hidden from the public. But this workaround wouldn’t hide comments consisting of an image or a single word, such as “criminal,” as BuzzFeed News noted.
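To make the logic of that workaround concrete, here is a minimal, hypothetical sketch of how a blocklist built from common words would behave. The blocklist contents and the function name are assumptions for illustration; Facebook’s actual moderation filter is not public.

```python
# Hypothetical sketch of the "hack" described above: treating common
# words like "a" and "the" as blocked terms hides nearly any full
# sentence, while single-word (or image-only) comments slip through.
BLOCKED_WORDS = {"a", "the"}  # assumed blocklist, for illustration only

def is_hidden(comment: str) -> bool:
    """Return True if the comment would be auto-hidden by the word filter."""
    words = {w.lower().strip(".,!?") for w in comment.split()}
    return bool(words & BLOCKED_WORDS)

print(is_hidden("He is a criminal"))  # True: the sentence contains "a"
print(is_hidden("criminal"))          # False: a lone word evades the filter
```

This also illustrates the gap Shelley acknowledged: any comment short enough to avoid common words, or containing only an image, would still appear publicly.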
The case is being considered a “significant” victory for Voller. But it also sets a concerning precedent for media companies, which could now be held liable for comments made by readers on social media platforms.
The regulation of social media content is the subject of debate across the globe. In the U.S., Section 230 of the Communications Decency Act shields social media companies from liability for what users post on their platforms. While this case maintains that social media platforms themselves are not liable for defamatory comments, it’s likely to worry publishers who host pages on social media and could usher in a new debate over how hate speech from readers and followers should be moderated.
H/T BuzzFeed News