It’s not as complicated as you might think.
By now you’ve probably heard a phrase like “An algorithm controls your news feed” at some point in the last 10 years, or maybe even the last 10 hours. It’s the secret sauce that lets services such as Facebook decide what to show you based on the things you like, the people you talk to, the news you read, and countless other data points collected to build up your social identity.
It’s simpler than you think
It sounds a bit complicated, but algorithms are not magical things that live in our computers and dictate our digital lives at random. They are created by people and implemented to serve a purpose. And sometimes, as in Facebook’s case, we don’t know much about them, because algorithms are proprietary information that companies don’t want to share with the general public.
“An algorithm, written in code, does all its work invisibly,” software developer Danilo Campos explained in an email. “The implementation of an algorithm is hidden from view. In some cases, this is very complete and deliberate. For security or competitive reasons, code may be obfuscated to prevent reverse engineering. Even when an algorithm is implemented in a publicly-readable form, though, it’s still being interpreted at lightning-fast speeds largely behind the scenes.”
In order to understand how the tools we use affect our lives and behavior, it’s helpful to learn a bit about the building blocks—how the digital sausage is made.
People associate algorithms with technology and math. The software and hardware we use every day rely on algorithms, whether they’re ranking ads and search results on Google or scheduling work inside our processors. But algorithms have existed for millennia: 4,000 years ago, the Babylonians composed algorithms on clay tablets.
Okay, but that’s still Babylonian math; let’s break this down even further. Chances are you learned your first algorithm before you could multiply double digits in your head, and it had nothing to do with mathematics.
In second grade, my teacher asked us to explain how to make a peanut butter and jelly sandwich to an alien who had never been to Earth. We couldn’t just tell the alien to slap two pieces of bread together; we had to break it down step by step, creating a list for the alien to follow. (You must unscrew the top of the peanut butter in order to get it out of the jar.) Without realizing it, we created an algorithm.
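That second-grade exercise translates almost directly into code. Here’s a hypothetical sketch of the sandwich “algorithm” in Python; the individual steps and their wording are invented for illustration:

```python
# A hypothetical version of the sandwich algorithm: an ordered list of
# unambiguous steps that even an alien (or a computer) could follow.
def pbj_sandwich_steps():
    return [
        "Unscrew the lid of the peanut butter jar.",
        "Unscrew the lid of the jelly jar.",
        "Take two slices of bread out of the bag.",
        "Use a knife to spread peanut butter on one slice.",
        "Use the knife to spread jelly on the other slice.",
        "Press the two slices together, spread sides facing in.",
    ]

# Carry out the procedure one step at a time, in order.
for number, step in enumerate(pbj_sandwich_steps(), start=1):
    print(f"Step {number}: {step}")
```

The order matters as much as the steps themselves: put the last step first and the alien ends up with a dry, peanut-butter-less sandwich.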
An algorithm is a set of rules that something follows to execute a task. It could be as simple as a recipe or as complex as a piece of software serving up personalized data on the web. The algorithms modifying our news feeds, deciding which Uber driver gets alerted when we need a ride, and recognizing patterns in information to filter spam are more complicated than making a sandwich, but they all come down to a procedure that accepts inputs and produces outputs.
Inputs on social networks and other services around the web could include your IP address, the size of your network, your location and time zone, or your search history. Outputs could be display ads, new friend suggestions, or which photo Instagram shows you first.
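To make that input-to-output shape concrete, here’s a toy sketch with entirely made-up rules and field names: a procedure that takes inputs (a viewer’s time zone offset and a list of photos with like counts) and produces one output (which photo to show first). No real service works this way; the point is only the shape:

```python
# Toy example only: invented scoring rules, not any real service's logic.
def pick_first_photo(photos, viewer_utc_offset):
    """Choose which photo to show first from a list of photo records."""
    def score(photo):
        # Input 1: when the photo was posted, shifted into the viewer's
        # local time (time zone is one of the inputs mentioned above).
        local_hour = (photo["posted_hour_utc"] + viewer_utc_offset) % 24
        daytime_bonus = 10 if 8 <= local_hour <= 22 else 0
        # Input 2: how popular the photo already is.
        return daytime_bonus + photo["likes"]
    # Output: the single highest-scoring photo.
    return max(photos, key=score)

photos = [
    {"id": "sunrise", "posted_hour_utc": 3, "likes": 9},
    {"id": "lunch", "posted_hour_utc": 15, "likes": 2},
]
print(pick_first_photo(photos, viewer_utc_offset=0)["id"])  # prints "lunch"
```

Change one input, say, the viewer’s time zone, and the output can flip, even though the rules never changed.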
Demystifying these concepts can help consumers realize that when companies obfuscate the programming and implementation of algorithms and other tech tools, they also blur the biases that go into building them. Campos said factors like which data is selected as input and how that data is weighted can influence business and policy decisions. But these variables are unknown to most consumers.
“An algorithm can be the automation of policy—while entirely concealing that policy at the same time,” he said. “Consumers who are unaware of this automated influence will be subject to any number of consequences, from being persuaded to buy things to missing out on critical information deemed unimportant by an algorithm or its designers.”
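Campos’s point about automated policy can be sketched in a few lines. In this hypothetical example (invented stories, signals, and weights), the ranking procedure itself never changes; only the weights do, and the choice of weights is the concealed policy:

```python
# Hypothetical feed ranker: the "policy" lives entirely in the weights.
def rank_stories(stories, weights):
    def score(story):
        return sum(weight * story[signal] for signal, weight in weights.items())
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Local election guide", "clicks": 2, "seconds_read": 300},
    {"title": "Celebrity gossip", "clicks": 9, "seconds_read": 40},
]

# A designer who weights clicks surfaces the gossip story first;
# one who weights reading time surfaces the election guide instead.
print(rank_stories(stories, {"clicks": 1.0, "seconds_read": 0.0})[0]["title"])
print(rank_stories(stories, {"clicks": 0.0, "seconds_read": 1.0})[0]["title"])
```

A reader scrolling the resulting feed sees only the ordering, never the weights, which is exactly the kind of invisible influence Campos describes.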
How algorithms affect your life
Algorithms are as prone to bias as the people who program them. Take, for instance, the groundbreaking report by ProPublica that discovered crime-prediction algorithms used in the U.S. legal system are biased against African Americans.
Facebook’s news feed algorithm has the power to influence elections, create filter bubbles of information it thinks you’d be interested in reading, and control how much traffic the company sends to the websites that publish on the service. It’s the most high-profile example of how algorithms affect our lives, because many of us use Facebook every single day, and the media, heavily affected by the algorithm, reports on even its slightest changes. When Facebook adjusts it, the effects can be felt across the industries that rely on it, from media to small businesses to advertising.
Which is why people freak out when Facebook modifies its algorithm: it is powerful enough to literally jeopardize people’s livelihoods. Even so, Facebook has begun to use humans in place of algorithms for things like event curation, because, despite everything its machines might know about you, humans still provide a more personal touch.
When you use personalized apps and services, you’re providing the inputs that determine the outputs. It’s why people’s Google Search results look different in different parts of the world, and why the ads that follow you across the web are different from your friends’. You never see the complexities at work, only the results.
For many of us, algorithms exist in the abstract, embodying futurist and science fiction writer Arthur C. Clarke’s adage that “Any sufficiently advanced technology is indistinguishable from magic.” Things work, but we don’t really think about why or how. But algorithms are not magic—they’re math and science and rules and logic; human programming is responsible for the information we see and apps we use each day.