Don’t fear the robot economy

The debate over an automated future is more complicated than you think.

BY GERALD HUFF

On March 2, 2015, Intelligence Squared held a debate in London on the proposition “Be afraid, very afraid: the robots are coming and they will destroy our livelihoods.” Walter Isaacson, the president and CEO of the Aspen Institute, and Pippa Malmgren, founder of H Robotics, argued against the motion.

Since their arguments included the most common objections to the risk of “technological unemployment,” it’s instructive to take them one by one and see how they line up with the facts and against our emerging technological innovations. (For the purposes of this article, I’ll use data for the United States as a representative advanced capitalist economy.)

Let’s start with Isaacson’s prepared statement. He begins:

Since my distinguished opponent, Mr. Keene, started with the Industrial Revolution, and said that this new revolution would be as disruptive as the Industrial Revolution, let me begin by reminding you that the Industrial Revolution wasn’t all that bad. At the end of it, we had more jobs, not fewer jobs. 

Technology can indeed be disruptive. It can disrupt certain jobs, as Mr. Keene has said, but it always has, and I submit always will produce more jobs, because it produces more wealth, more personal income, more per capita wealth, and thus, more things that we can make and buy.

The first argument against technological unemployment is that we have had technological innovation for hundreds of years, and the simple fact is that more people are employed now than ever before. The mechanism for this is fairly clear, as Isaacson describes. As technology automates a sector (like agriculture, then manufacturing), its products become cheaper. This creates more disposable income for people to spend on new products and services, so jobs are created in new sectors.

With a rising population generating increasing demand for goods and services that require human labor, it follows that there will be more and more jobs created. For example, from 1760 to 1850 (i.e., the Industrial Revolution), the population in the United States grew from 1.6 million to 23.2 million. So there were more jobs because there were more people creating demand in the economy for things that still required significant human labor to produce, transport, sell, and repair.

What is different about the technologies emerging now from academia and tech companies large and small is the extent to which they can substitute for or eliminate jobs that previously only humans could do. Over the course of thousands of years, human brawn was replaced by animal power, then wind and water power, then steam, internal combustion and electric motors. But the human brain and human hands—with their capabilities to perceive, move in and manipulate unstructured environments, process information, make decisions, and communicate with other people—had no substitute. 

The technologies emerging today—artificial intelligence fed by big data and the internet of things and robotics made practical by cheap sensors and massive processing power—change the equation. Many of the tasks that simply had to be done by humans will in the coming decades fall within the capabilities of these emerging technologies.

When Isaacson says “it always has, and I submit always will produce more jobs, because it produces…more things that we can make and buy,” he is falling into the Labor Content Fallacy. Without repeating the entire argument, there is no law of economics that says a product or service must require human labor. The simplest example is a digital download of a song or game, which has essentially zero marginal labor content. In the coming decades, for the first time in history, we will be able to “make and buy” a huge variety of goods and services without the need to employ people. The historical link between a rising population’s growing demand for goods and services and the creation of more human jobs will be broken.

Isaacson continued:

Every succeeding generation has had a new wave of worry, dire warnings that jobs are being threatened. But in each one of these cases, they turned out to be wrong.

I was at Time magazine, and I went back to look 50 years ago, 50 years ago to this day, almost. Time Magazine did a cover, just like the line that we’re looking at. It said, “The robots are coming, automation will take your job.” 

It quoted a rather famous economic historian, Robert Heilbroner. Most of you have heard of him. He says, “The new technology is threatening a whole new group of skills.” 

He sounded just like Mr. Keene. He said, “The calculating, the remembering, the comparing, the OK’ing skills, the preserve of the office worker will go away, and machines will invade society to make that type of human labor redundant.” 

Guess what? It’s been 50 years, and since then, soon after he wrote that, we had the greatest leaps of technology you can imagine. The Internet, the personal computer and the microprocessor all come into play, but nobody has been rendered redundant.

This is a perfectly fair criticism of those who claim robots are coming for our jobs—these alarms have been raised before, going back to Keynes in the 1930s, the Triple Revolution letter, and breathless Time magazine articles in the 1960s. Anyone making the case for increasing technological unemployment in the coming decades must explain why “this time is different” from those other times.

I believe this time is different because the capabilities of the hardware and software are different and the nature of human work is different. Manufacturing is only about 9 percent of employment in the U.S. The vast majority of jobs now are in the retail, service, and knowledge economy sectors. Until now, technology has been a complement to people in those sectors but could not replace them. 

However, ubiquitous connectivity, advancing computer vision, learning, reasoning, and language capabilities, and the vast digitization of economic activity are beginning to enable direct substitution for and elimination of human labor.

We are already seeing the first prototype self-driving cars and trucks. While working through the technical, legal, and regulatory issues may take 10–15 years, millions of people who drive for a living will eventually no longer have a job. It will become increasingly easy for people to interact with machines and for software agents to act on their behalf, eliminating many front-line customer service roles. Knowledge workers who enter, transmit, process, or analyze data can be replaced by AI programs. Significant portions of what lawyers and doctors do can be replaced by eDiscovery and eDiagnosis software.

The key technology underlying this is machine learning, which enables computers to learn from the vast and growing quantities of digital data now available, without a human needing to program them step by step.

While this technology has been around for decades, only in the last few years have we had access to huge digital databases for training machine learning algorithms, and there have been breakthroughs in designing new deep learning approaches that rival human perception. Machines that can learn are what make this time fundamentally different from the past.
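To make the contrast with step-by-step programming concrete, here is a minimal sketch in Python. It assumes the scikit-learn library is available, and the handwritten-digits dataset and classifier choice are purely illustrative, not anything drawn from the debate. Instead of a human writing rules for what each digit looks like, the model infers them from labeled examples.

# A minimal sketch of "learning from data" rather than step-by-step programming.
# Assumes scikit-learn; the dataset and classifier are illustrative choices only.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

digits = load_digits()  # ~1,800 small labeled images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

# No human specifies what makes a "7" a "7"; the model infers that from examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("accuracy on unseen digits:", accuracy_score(y_test, model.predict(X_test)))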

Isaacson continued:

There are more jobs, there are more types of jobs now than there were 50 years ago. As Marc Andreessen, the person who invented the web browser, said, “Just as most of us today have jobs that weren’t invented 100 years ago, the same will be true 100 years from now.”

This is a very common statement that has been repeated all over the internet, but it is unfortunately completely untrue. If you examine the Bureau of Labor Statistics breakdown of occupations, what you find is that only 20 percent of today’s occupations did not exist one hundred years ago, and more importantly, 90 percent of all U.S. employment today is in fact in occupations that did exist 100 years ago. 

While some new jobs have been created, mostly in the computer/Internet realm, they employ very few people. People cite anecdotal jobs like “Web designer” and “social media marketing coordinator” and ask, “what farmer in 1915 could have imagined that job?” They don’t cite the actual top occupations in the U.S.: retail salespersons, cashiers, food prep workers and servers, office clerks, nurses, customer service representatives, waiters and waitresses, hand laborers, secretaries, janitors, managers, stock clerks, and truck drivers, because of course the farmer would not have been the least bit surprised that those jobs existed. (Note that those few listed occupations account for about 25 percent of U.S. employment.)

The fact is that we have not invented all kinds of new jobs that employ masses of people. For the last 100 years, we have shifted people from agriculture and manufacturing into jobs that already existed, but that the machines were not yet capable of doing. And the machines are becoming more and more capable of performing those jobs.

Let me give some examples. There’s the app economy. It began in 2008, when Steve Jobs was convinced by other members of his management team to allow outsiders to create apps for the iPhone, and then, the iPad. 

The app store alone, since then, has created 627,000 jobs. The app economy last year was $100 billion, far surpassing the film industry. This is an industry that did not exist seven years ago.

The figure of 600,000 jobs in the app economy has unfortunately also spread over the internet. It comes from a single analysis of help wanted ads with a highly suspect methodology. The authors of the study counted open positions for mobile app developers and multiplied that count by the ratio of existing software developers to open software developer positions in the industry at large. So if companies generally had seven existing software developers for every one open software developer position, the ratio was seven to one. In this study, every mobile app open position was therefore also assumed to represent seven existing mobile app jobs.

This makes no sense. Mobile application development is quite new compared to established software development teams. Companies are generally hiring their first mobile developers or supplementing small existing teams. There was also no accounting for software developers simply shifting from Web or other kinds of development into mobile development as those older areas became more productive and required less headcount.
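A toy calculation shows why that extrapolation breaks down for a brand-new field. This is only a sketch: the seven-to-one ratio is the illustrative figure above, and the count of mobile openings is a hypothetical number chosen for arithmetic convenience.

# Toy illustration of the study's job-multiplier logic and its flaw.
# The 7:1 ratio is the illustrative figure above; the openings count is hypothetical.
existing_devs_per_opening = 7        # mature software teams: ~7 employed devs per opening
mobile_openings = 10_000             # hypothetical number of mobile-app help wanted ads

# The study's assumption: the mature-team ratio also holds for mobile development.
study_estimate = mobile_openings * existing_devs_per_opening
print(study_estimate)                # 70,000 inferred existing mobile-app jobs

# But if most companies are hiring their first mobile developers, each opening may
# correspond to roughly zero or one existing job, so the same ads imply far fewer jobs.
new_field_estimate = mobile_openings * 1
print(new_field_estimate)            # 10,000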

Another important fact about the “app economy” is that it is not very democratic. Like many digital products sold primarily through user recommendation, it is subject to winner-take-all dynamics. A very small number of apps do extraordinarily well, while the rest in the “long tail” earn almost nothing. 

Winner-take-all markets are a very precarious place to attempt to make money. It is hard to recommend the app economy to a truck driver used to a steady income that pays the mortgage and the bills.

Other advances in technology have helped facilitate new forms of work, such as the sharing economy, where people both here, New York, New Orleans can drive, be part of Uber, be part of AirBnB, be part of Lyft.

Likewise, online marketplaces such as Amazon and eBay have enabled the rise of artisans, makers, the type of people who existed in the pre-industrial age. People who had creative and things, and then, could have connections to customers around the world. If you create a book or a song, now, you have new ways to publish, as Pippa did with her own book. New ways to distribute.

If you dream up a new specialty, this world allows you to find customers and provide a service, just as if you were an artisan or a service maker in the pre-industrial revolution. It even helps long established companies find markets all over the globe. There’s a more fundamental shift going on in the economy that’s also good.

We’re creating an on demand, or on tap economy. If you need a housekeeper, or a handyman, you don’t have to go to a particular firm to get it. It can be done on tap.

If new technologies reduce the total number of jobs, we would all be out of work now. I doubt that Mr. Keene, nor I, nor you can guess what wondrous jobs technology will bring to our great grandchildren.

Unfortunately, these are also not good replacements for a full time job. Marketplaces like Uber, AirBnB, eBay, and Etsy certainly do enable some people to supplement their income. But very few people actually make a living from these marketplaces. It is true that distributing content like songs and books is far, far easier than it used to be. But because of the vast quantity of content on those networks and winner-take-all dynamics, very few people can earn a living from content creation and distribution either.

As for the on demand or on tap economy, it does not create any new jobs; it just makes it slightly more efficient for customers to find existing handymen or housekeepers.

Isaacson’s final statement is a straw man. No one is making a categorical statement that “new technologies reduce the total number of jobs.” The statement at issue is that the particular technologies under development right now—in AI, robotics, 3D printing, advanced personalized health care, etc.—will have a significant impact on jobs that employ masses of people over the next few decades. There will no doubt be “wondrous jobs” created for our great grandchildren—there is simply no evidence, however, that those jobs will employ masses of people.

Pippa Malmgren’s prepared statement had some additional arguments.

I take you to the case of Jenny Doan, who runs a fantastic quilting company in Missouri. This is a woman who is getting 30 million YouTube hits for showing people how to make quilts.

A very old-fashioned activity, but not possible without computers, without the artificial intelligence to make the pieces the right size, to sew them together quickly. She employs 85 people. 300 people a day come from around the world to go to her shop in the middle of Missouri.

To that end, I have to say, as someone who’s actually employing people in this business, what there’s a shortage of out there is not intellectual capability, or high levels of education and training. There’s a shortage of people who have the practical skills, who know how to make an engine work. Who know how to do electrical wiring.

We need more people who can do these things, because they are the ones, the technology empowers to do something. Robotics and artificial intelligence won’t destroy jobs, they are the tools by which we will build the jobs we are going to have tomorrow.

They will be different than the ones we do today, and thank goodness, because that will be much more interesting, exciting, and draw on our capabilities much more than anything else we can imagine.

It is not exactly clear how the anecdote about Jenny Doan and people “who have practical skills” relates to Malmgren’s closing thought. Jenny Doan’s quilters and the people who “know how to make an engine work” do not have the “different jobs of tomorrow” somehow related to AI and robotics. In response to an audience question, Malmgren expressed another commonly held belief:

There’s a global shortage right now, for example, of welders. They’re hiring people in their 70’s to work with modern robotics because welding still is a skill. A robot can’t replicate what a human knows about this. The human nuance and knowledge of how things work is still essential in the application of robotics.

There may be job openings for welders, but employment in welding and cutting actually declined slightly, from 390K in 1990 to 370K in 2014. Welding robots with vision are coming soon and will replace even more of those jobs. Welding and other trades would not seem to be a high growth refuge for service and knowledge sector workers displaced by AI and robotics.

In the ongoing Q&A, Isaacson argued:

I do think that we’ve always thought AI was coming, ever since about 1956, and it always is a little bit harder than we think. So whether it is domain awareness or creativity, the humans have a way of cognitive thinking that is so fundamentally different, in its carbon-based, wetware system of our brain, than in the digital silicon system, that artificial intelligence remains a mirage and it’s not about to change everything. Our creativity, our humanism, will always be part of the party.

The fact that computers operate differently than human brains tells you nothing about whether they will be able to perform tasks that the human brain performs, any more than observing that airplanes do not flap their wings the way birds do tells you whether they can fly. The fact that self-driving cars use high res digital maps, LIDAR, cameras, and ultrasonic sensors instead of a “wetware brain” and eyes does not mean they can’t exist and replace millions of human drivers. And a very large percentage of jobs do not require extensive creativity; they consist of routine cognitive or physical tasks.

An emerging model of training AIs can also deal with the “unexpected” or “exceptional” cases in these jobs. A small contingent of human workers remains on the job even as AIs take on the vast majority of the work. When the AI comes across something it hasn’t seen before, it refers the case to a human, observes what the human does, and learns from it so it can handle that case the next time.
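A rough sketch of that escalation loop, in Python; the model interface, confidence threshold, and helper names here are hypothetical placeholders rather than any particular vendor’s system. The design point is simply that the humans handle only the rare, unfamiliar cases, and each one they resolve becomes training data.

# Sketch of the human-in-the-loop pattern described above; all names are hypothetical.
CONFIDENCE_THRESHOLD = 0.90
labeled_exceptions = []  # grows as humans resolve cases the AI hasn't seen before

def handle_case(model, case, ask_human):
    """Automate routine cases; refer novel ones to a human and learn from the outcome."""
    label, confidence = model.predict_with_confidence(case)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label                       # routine case: handled without a person
    # Novel or ambiguous case: refer it to the small human contingent...
    human_label = ask_human(case)
    # ...and record what the human did, so the next retraining run covers this case.
    labeled_exceptions.append((case, human_label))
    return human_label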

As for what jobs will be safe in the future, Isaacson said:

If you’re going to be looking for a job in the future, be creative. What will robots/computers not do that we can do? They work by algorithm. Almost by definition they are not creative. They don’t think out of the box, they don’t think imaginatively. Our minds are cognitive, so we supply Intelligence Squared debaters. No robot is going to replace this panel but nor will it replace the Pixar people and the animation people and the people doing the creativity. And if you learn to code and connect the arts and the sciences that was Steve Jobs’ great advice.

Actually, computers are already being programmed to be creative, in the sense that they are given underlying rules of physics, chemistry, music, or cooking and then they produce innovative hypotheses, compositions, or combinations of ingredients that in fact no human would have thought of. In a way, computers are like children, and they will generate many “out of the box” ideas because they are quite free of accumulated human “boxes” like conventional wisdom and prevailing paradigms.

While many jobs have a creative component, it’s also important to note that all of the creative occupations in the U.S. economy (e.g., artists, writers, editors, musicians, photographers, actors, dancers, designers, choreographers, journalists, marketers/public relations, etc.) only account for about 1 percent of employment. There has never been an economy filled with only highly creative or highly skilled people.

Malmgren picked up on the theme of human creativity:

Look at some of the greatest innovations of our time…[describes MIT container farming]. … You cannot ask a computer to create that. Human beings create that. Computers can execute based on algorithms once you’ve decided what you need executed. But to solve the problems our society faces, that still is about human ingenuity.

Once again, the arguments for what computers can’t do ignore the reality of real people in actual occupations. The highly talented scientists, researchers, and engineers who “invent the future” are a very small fraction of the workforce. In the U.S., there are about 700,000 scientists and three million engineers, or less than 2.5 percent of workers in total. The vast majority of workers are not involved in the “ingenuity” phase of problem solving; they are involved in the “execution” phase. And that is where AI and robotics are going to become more and more capable.

In summary, these common arguments against the risk of technological unemployment ignore what is different about the technologies under development, and they suggest solutions (“be creative,” “become a welder,” “develop an app”) that are unrealistic ways for tens of millions of displaced workers to continue to earn an income.

Going back to the debate proposition, however, I don’t believe we need to be afraid. I think this technological progress can be an amazing achievement for humanity, if we adjust our mindset appropriately. With machines doing most of the work we will need a new social contract, new mechanisms like a basic income guarantee, and new conceptions of a meaningful life. I believe we should embrace a future where humans can be liberated to follow their passions, reach their full potential, and contribute to their communities.

Some people will find the notion of people not selling their labor for income incomprehensible or impossible. But the notion of a “job” working for a stranger at a “company” is a very recent invention. And humans are supremely adaptable. After tens of thousands of years living in small villages of hundreds of people where everyone was related, who could have imagined cities of ten million strangers living side by side? Ideas like slavery, commonly accepted for millennia, were eventually discarded. It’s a challenge, but cast your mind ahead 100 years. 

Won’t our descendants look back and say, “I can’t believe that for hundreds of years people spent most of their lives doing all that boring work that machines do for us now. They had no time to actually enjoy life”?

Gerald Huff is the principal software engineer at Tesla Motors.

This article was originally featured on Medium and reposted with permission.
