Blackboard

Why we need a Web that forgives and forgets


By CHRIS R. ALBON

We've all said things we regret or don't want in the public eye. We've all been unreasonably angry with a family member, frustrated with our job, or, on a bad day, just plain cruel to a stranger. It is part of being human. Offline, these minor infractions are quickly forgotten, brushed off as "they're having a bad day." The problem is that if they appear online, they can stay with us forever.

The simple fact is that businesses are so motivated to capture all our data that increasingly anything we say or post online is recorded by default. And what's worse, it's not only the things we put online, but the things that other people put online about us. With Facebook's photo tagging feature, a photo you never knew was taken—maybe at a bar at 2 a.m. or on your couch after a long night of drinking—can become part of your digital reputation. The Web is quickly coming to the point where everything you say or do online can be used against you in the court of public opinion. Some say we could be looking at the end of forgetting, where the past can be accessed with the click of a mouse.

Over the years, people have adapted, becoming more vigilant about what they post or have posted about them online. We have become increasingly practiced at online self-censorship. But we shouldn't need to be. We deserve something better: a Web that forgets.

As Viktor Mayer-Schönberger argues in Delete, forgetting is an important element of human nature. We need it for all sorts of important things, “from the ability to make sound decisions unencumbered by the past to the possibility of second chances.”

For most of its existence, Twitter has been, if not by design then de facto, ephemeral. The service's limited search feature made it almost impossible to find a specific tweet from someone more than a few months back. I have been on Twitter for four years; far from complaining, I enjoyed this limitation, because it freed me from the constant voice in my head asking, will this tweet come back to haunt me five years from now? While Twitter might have considered it a technical limitation, I considered it an essential feature. Those days, however, are now over (you can now download all your tweets).

Thankfully, other platforms are emerging to fill that need. A small number of companies are building tools that forget what users did in them. They are building an ephemeral Web. Snapchat is probably the best example of this type of tool: the smartphone app lets users send text, photos, and video that self-destruct after a predetermined period of time. It forgets what you sent. The up-and-coming messaging app has been hugely popular among teenagers wanting to share insults or embarrassing photos. I wish it were popular with all of us.

We deserve an ephemeral Web, one with communications unburdened by permanence. We deserve to have the Web—at least some of the time—forgive and forget. Hopefully, applications like Snapchat are just the beginning of that ephemeral Web. With luck, in time there will be a whole class of tools, from email clients to microblogging platforms, whose killer feature is that they capture nothing and remember nothing.

There will always be a place for the Facebooks and Twitters of the world. They provide an important identity layer on the Internet, and a place to connect with friends. I store all of my family photos on Google+ precisely because it never forgets.

But in the digital world's quest to capture everything, we have lost something. Ephemeral tools like Snapchat provide a much-needed outlet for the things we want to share but don't want immortalized on the Web forever. They are an acknowledgment of the nuances of human communication. Some things we say we mean forever; others we mean to be forgotten.

Chris R. Albon is a political scientist and writer on the global politics of science and technology. Presently, Chris leads the Governance Project at FrontlineSMS. Prior to FrontlineSMS, Chris earned a Ph.D. in Political Science from the University of California, Davis.
 
Photo by Bigstock.com