The Web felt very different 15 years ago, when I founded Drupal, an open source tool for building websites. Just 7 percent of the population had Internet access, there were only around 20 million websites, and Google was a small, private company. Facebook, Twitter, and other household tech names were years away from being founded. In these early days, the Web felt like a free space that belonged to everyone. No one company dominated as an access point or controlled what users saw. This is what I call the “open Web.”
But the Internet has changed drastically over the last decade. It’s become a more closed Web. Rather than a decentralized and open landscape, many people today primarily interact with a handful of large platform companies online, such as Google or Facebook. To many users, Facebook and Google aren’t part of the Internet—they are the Internet.
I worry that some of these platforms will make us lose the original integrity and freedom of the open Web. While the closed Web has succeeded in ease-of-use and reach, it raises a lot of ethical questions about how much control individuals have over their own experiences. And, as people generate data from more and more devices and interactions, this lack of control could get very personal, very quickly, without anyone’s consent. So I’ve thought through a few potential ideas to bring back the good things about the open Web. These ideas are by no means comprehensive; I believe we need to try a variety of approaches before we find one that really works.
It’s undeniable that companies like Google and Facebook have made the Web much easier to use and helped bring billions of people online. They’ve provided a forum for people to connect and share information, and they’ve had a huge impact on human rights and civil liberties. For all of this, they deserve applause.
But their scale is also concerning. For example, the Chinese messaging service WeChat recently used its popularity to limit market choice: the company blocked access to Uber to drive more business to its own ride-hailing service. Meanwhile, Facebook engineered limited Web access in developing economies with its Free Basics service. Touted in India and other emerging markets as a way to help underserved citizens come online, Free Basics gives users access to only a handful of pre-approved websites (including, of course, Facebook). India recently banned Free Basics and similar services, ruling that these restricted Web offerings violated the essential principles of net neutrality.
Beyond market control, the algorithms powering these platforms can wade into murky waters. According to a recent study from the American Institute for Behavioral Research and Technology, the information displayed in Google’s search engine could shift the preferences of undecided voters by 20 percent or more, all without their knowledge. Considering how narrow the margins in many elections can be, this effect is significant. In many ways, Google controls what information people see, and any bias, intentional or not, has a potential impact on society.
In the future, data and algorithms will power even more grave decisions. For example, code will decide whether a self-driving car stops for an oncoming bus or runs into pedestrians.
It’s possible that we’re reaching the point where we need oversight for consumer-facing algorithms. Perhaps it’s time to consider creating an oversight body. Similar to how the FDA monitors the quality and safety of food and drugs, this regulatory body could audit algorithms. Recently, I spoke at Harvard’s Berkman Center for Internet & Society, where attendees also suggested a global “Consumer Reports”-style organization that would review the results of different companies’ algorithms, giving consumers more choice and transparency.
Gaining control of our data
But algorithmic oversight is not enough. Billions of people are using free and convenient services, often without a clear understanding of how and where their data is being used. Many times, this data is shared and exchanged between services, to the point where people don’t know what’s safe anymore. It’s an unfair trade-off.
I believe that consumers should have some level of control over how their data is shared with external sites and services; in fact, they should be able to opt into nearly everything they share if they want to. If a consumer wants to share her shoe size and color preferences with every shopping website, her experience with the Web could become more personal, with her consent. Imagine a way to manage how our information is used across the entire Web, not just within a single platform. That sort of power in the hands of the people could help the open Web gain an edge on the hyper-personalized, easy-to-use “closed” Web.
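To make the idea concrete, here is a minimal sketch of what per-site, opt-in data sharing could look like. Everything here is hypothetical and illustrative: the `ConsentProfile` class, the `grant`/`revoke`/`data_for` names, and the example site names are my own inventions, not an existing standard or API.

```python
# Hypothetical sketch of consumer-controlled, opt-in data sharing.
# No real protocol or library is implied; all names are illustrative.

class ConsentProfile:
    """Holds a user's data plus per-site, per-field consent grants."""

    def __init__(self, data):
        self.data = data    # e.g. {"shoe_size": 8, "color_pref": "blue"}
        self.grants = {}    # site -> set of field names the user opted in

    def grant(self, site, fields):
        """The user explicitly opts chosen fields in for one site."""
        self.grants.setdefault(site, set()).update(fields)

    def revoke(self, site):
        """The user withdraws all consent for a site."""
        self.grants.pop(site, None)

    def data_for(self, site):
        """A site sees only opted-in fields; the default is nothing."""
        allowed = self.grants.get(site, set())
        return {k: v for k, v in self.data.items() if k in allowed}


profile = ConsentProfile(
    {"shoe_size": 8, "color_pref": "blue", "email": "user@example.com"}
)
profile.grant("shoes.example", {"shoe_size", "color_pref"})

print(profile.data_for("shoes.example"))  # only the opted-in fields
print(profile.data_for("ads.example"))    # empty: no consent, no data
```

The key design choice is that the default is zero sharing, and the consent record lives with the user rather than inside any one platform, which is what would let it span the entire Web instead of a single walled garden.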
Decentralizing power and control
In order for the consumer-based, opt-in data-sharing system described above to work, the entire Web needs to unite around a series of common standards. This idea is daunting in and of itself, but information-sharing standards like OAuth have shown us that it can be done. People want the Web to be convenient and easy to use. Website creators want to be discovered. We need to find a way to match user preferences and desires with information throughout the open Web. I believe that collaboration and open standards could be a great way to decentralize power and control on the Web.
Why does this matter?
The Web will only expand into more aspects of our lives. It will continue to change every industry, every company, and every life on the planet. The Web we build today will be the foundation for generations to come. It’s crucial we get this right. Do we want the experiences of the next billion Web users to be defined by open values of transparency and choice, or the siloed and opaque convenience of the walled garden giants dominating today?
I believe we can strike a balance: companies can still grow, profit, and innovate while championing consumer privacy, freedom, and choice. Thinking critically and acting now will help ensure the Web’s open future for everyone.
Photo via gualtiero/Flickr (CC BY 2.0)