Garlin Gilchrist at SXSW | Photo via University of Michigan School of Information

Former Obama social media manager doesn’t blame technology for rise of fake news

The problem is more human than it is bot.

 

Grace Speas

Tech

Posted on Mar 12, 2018   Updated on May 21, 2021, 10:02 pm CDT

Garlin Gilchrist managed social media for the Obama campaign in Washington state in 2008 and is now the executive director of the Center for Social Media Responsibility, which launched this March at the University of Michigan School of Information.

The center is opening a dialogue about whether we can believe what we see online, so it is timely that Gilchrist and the center’s chief technologist, Aviv Ovadya, will present a session titled “Infocalypse: The Future of Misinformation and How We Stop It” on Tuesday at SXSW.

Ovadya said in a Washington Post article that if society fails to take immediate action to protect our news and information ecosystem, we will careen toward an info apocalypse, a catastrophic failure of the marketplace of ideas.

Avoiding that failure will depend mainly on human effort, not technology, Gilchrist told the Daily Dot.

“Behind technology is always people, so we need to really do what we can to optimize those technology experiences for people and not just for technology’s sake,” Gilchrist said.

Gilchrist said his vision for combating fake news will require people of “all industries, all walks of life, all sorts of expertise,” including media makers, consumers, social media companies, academia, and the public and private sectors.

“I think what’s important is to have as many disciplines recognize how important this is,” Gilchrist said.

There is a common misconception that fake news is more of a technology problem than a user behavior problem. Gilchrist said a recent MIT study proved that idea wrong.

“Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it,” the study’s authors wrote.

Ryan Mac of BuzzFeed News, who will moderate the SXSW talk, said there is an educational element to eradicating false information and toxicity on social media. People need to be taught what to look for, especially as the technology gets better, Mac said.

“There’s certain examples of bots being employed to magnify messages, but there’s also actual people sharing this information and we tend to overlook that,” Mac said.

Mac said Ovadya, the technologist presenting with Gilchrist, has done extensive research into future uses of technological manipulation and has still only scratched the surface. For example, the concept of deep fakes, manipulated video that places famous people’s faces onto other people’s bodies, started in porn production and is now moving into the political arena, Mac said.

“[Technology] gets better at deceiving you and it gets better at making you question what’s real and what’s not,” Mac said.

Audio manipulation software is also evolving, and it is making social media users more skeptical of even genuine news, Mac said.

“Let’s say there’s an actual audio recording of a politician saying things they shouldn’t be saying,” Mac said. “Now, people know there’s technology out there that can mimic voices, for example. That can be turned around to question whether that tape is real.”

Gilchrist said the Center for Social Media Responsibility is well positioned to equip users and platforms with tools to distinguish fake from real. Since his work on Obama’s 2008 campaign, Gilchrist said, he has learned that these networks are, at the end of the day, about human connection.

“They’re networks of people,” Gilchrist said. “Remembering that and remembering that this is about people, how they connect, and how they share things with one another, that’s how we fix this.”

*First Published: Mar 12, 2018, 7:16 pm CDT