The panel, “Level Up: Overcoming Harassment in Games,” took place at the Austin Hyatt Regency, an out-of-the-way location that may have contributed to a smaller crowd for what was—online, at least—one of the Interactive Festival’s most talked-about events.
“Level Up” became one of two focal points for a conversation about harassment and tech culture last fall after SXSW attempted to cancel it, along with another panel run by panelists affiliated with Gamergate. SXSW organizers cited security concerns after hearing from angry conference attendees and supporters on both sides of the subculture war. Ironically, canceling “Level Up” seemed to underscore the point that many women have been making about online harassment: that the goal is to prevent women from speaking about their own experiences.
In response to the controversy, SXSW reinstated both panels and created the harassment summit, an all-day event with a broader focus on harassment across the Internet. The “Level Up” panel was reinstated as part of the summit, while the other panel, “SavePoint: A Discussion on the Gaming Community,” was scheduled for a Tuesday time slot.
Panelists included Caroline Sinders, a design researcher for IBM’s Watson AI, and frequent Gamergate target Randi Harper, who came to widespread attention after creating the Gamergate auto-blocker during the peak of the movement in 2014. Harper later founded the Online Abuse Prevention Initiative, which consults with tech companies to implement online abuse prevention techniques. In addition to Sinders and Harper, academic and gaming journalist Katherine Cross, and moderator Kami Huyse, founder of anti-cyberbullying non-profit Civilination, joined the conversation.
Huyse opened the panel by curtailing any anticipated drama around Gamergate. “Gamergate is merely a symptom of the real problem,” said Huyse, noting that the panel would focus on addressing online harassment from “design, technology, and community points of view.”
Huyse stated the goal of the discussion was to “better design social media platforms to create respectful places while still allowing for challenging points of view.”
The panel conversation dealt with the positives and negatives of gamification and its application to online harassment. “Gamification is often used as a substitute for moral education,” said Cross, noting that gamification can often encourage people to do the right thing without allowing them to understand why it’s the right choice.
Cross felt that gamification has turned online harassment into a game. “Twitter is one of the most addicting online games ever devised in many ways,” Cross said, noting that the platform’s inherent structure can give you a rush by promoting the “aggressive, bigoted, bumper sticker slogan that you make out of your tweet.”
“We’re quite literally the dragons that have to be slain,” Cross said. “That’s no way to approach another human being.”
She commented that harassers see themselves as the game players and heroes and their targets as NPCs (non-player characters).
Cross pointed out that, culturally, many physical spaces have social architecture inherent in their structures to prevent abuse. The difference is that the online world often lacks those same structures. Sinders compared Twitter to a giant real-life open space, advocating for the “idea of semi-private spaces in very public networks,” and for “spaces that have agency for users to define their experiences.”
As an example of harmful online architecture, Harper mentioned the Reddit practice of downvoting, which she stated can often reinforce mob justice by creating an echo chamber, as redditors downvote comments they disagree with.
Sinders noted that each subculture and social community has its own language and connotations around its specific tools. Speaking of Facebook “likes,” she noted that Facebook’s reaction emojis touched on the idea of different communities needing their own nuances and forms of expression. “There are so many other kinds of emotions we experience online.”
Cross called for better community management, taking harassment communities seriously, and recognizing that Facebook’s inherent social model—particularly its insistence on real names and de-anonymizing users—can also be harmful.
“Anonymity is not the problem,” Cross said. “Platforms like Yik Yak need stronger moderation. You can’t just throw people into a social environment and expect them to organize themselves … that happens irrespective of whether the users are anonymous.”
Sinders emphasized the importance of allowing people to have ownership over their own spaces and the tools they use to build their own spaces.
Harper disagreed that Twitter is made up of real communities, unlike Reddit, where communities take literal form as subreddits. She noted that brigading—the practice of invading another subreddit in order to promote certain points of view—skews the tone of discussion on the site. Yet Reddit’s upvote system treats votes from outsiders and longtime community members equally. “That’s not really a good form of design. How do you know who your community members are? You don’t.”
Cross mentioned Riot Games and League of Legends as an example of a company that found a way to ignite the “silent majority” of gamers in order to shape the community positively. “That system demonstrates that when you treat players like citizens they will act like citizens.”
“You have to make community something that you build from the ground up along with the game,” Cross said. She also noted that the same principle applies to social media. “[Twitter’s] system was never designed to accommodate any of this,” she said, describing the subsequent design changes as “rear-guard action.”
Huyse pointed out Tim Berners-Lee’s recent speech at Sundance, in which he called on platforms to step up and lead the charge in implementing cultural change online. Cross praised the comment moderation tool Civil Comments as a way of implementing possible abuse prevention, while Harper cited a number of small changes platforms like Twitter could make in order to give users more control and ownership over their feeds.
Sinders proposed areas for change including platform accessibility, privacy settings, the option to turn off retweets and Twitter embeds, Twitter tagging, and digital design that allows other social filters beyond merely public and private.
“Going private is one of the most public things you can do when you’re being harassed,” she pointed out.
Again and again, the panelists emphasized the need for fine-grained user control over social networks and platforms.
Harper noted the high overlap between patterns of harassment and patterns of spam. She expressed hope that corporations will ultimately respond to harassment the same way they eventually responded to spam—with the awareness that not every form of communication online is worth heeding.
Photo by Aja Romano