Monday, October 21, 2013

About Those Robots Coming For Our Jobs

Kevin Drum has a piece today intelligently arguing that a lot of people underestimate the potential for "smart" machines to displace human labor in the next couple of decades, in particular by making flawed historical analogies to previous technological changes that had nothing to do with artificial intelligence. At the very least he's quite persuasive that one can't lump the possible forthcoming AI/"smart machine" revolution in with previous industrial revolutions. But that's not really what I want to talk about. I want to talk, rather briefly, about the following passage from the beginning of his piece. Here's how he describes Tyler Cowen's "average is over" thesis, with which he says he broadly agrees:
A small number of very smart people will do really well, while the broad middle class will end up with bleak, low-paying jobs—assuming they're lucky enough to have any jobs at all.
That, as a consequence of smart robots doing the work we're all accustomed to doing now. My problem here isn't really with the positive claim that robots will displace most or all human work. It's with the assumption that, in such a world, we should view as "lucky" those who manage to still have a job. Why? "A job," in its current form, means spending countless hours of your life toiling away not for your own enjoyment but because someone out there values the stuff, tangible or otherwise, that you'll produce through your labors, and is therefore willing to give you money for it. And another way of saying "money" is "a social agreement to give you stuff and/or do stuff for you." The way the economy works circa 2013, as it has worked for centuries now, is that everyone pretty much realizes that a ton of stuff needs to get done for all of us to enjoy prosperity, so we agree, through the social convention of money, to do stuff for other people who have demonstrated that they've done something to contribute to this prosperity. Or that they will do something to contribute in the future, and have gotten credit to reflect that expectation. Or that they have parents who contributed something.

Well, that's the unalloyed capitalist vision: I do stuff for you on the implicit premise that the currency you hand me certifies that, at some point in the past or future, you have done or will do something for someone else. Recently most societies around the world have started tinkering with that, suggesting that certain basic needs should maybe be given to people just because they need them, not because they've done anything for anyone. So we get social welfare policies, which capitalists hate because they undermine the basic "you only get something by doing something for someone else" incentive. But here's the thing: why do we need that incentive? Because there's a ton of stuff that needs to get done, i.e. that people need to do, and for various reasons persuasively detailed by political economists that stuff gets done a lot more efficiently if we don't all just do everything for ourselves, but rather each pick something of social value to specialize in and get paid for, and then buy what we need from the fruits of others' labors.

But what if we didn't need all that human labor to produce all the stuff we need? What if the extreme "smart machine" hypothesis comes true, and robots become able to do most of the work people currently do? And not just manufacturing, but much of what we currently consider services. That's the "smart" part: the idea that machines will be able to do complex intellectual tasks without human assistance. Well, in that scenario (assuming we fend off any possible robot rebellion), the premise that prosperity requires so much human labor that basically everyone needs to spend a huge chunk of their life working would stop being true, or at least become far less true. And if that happens, the basic logic of capitalist economics will be dead. We won't need to condition people's ability to get the stuff they want on whether they've done something to produce the stuff that other people want. The idea that people "deserve" to consume only as much as they can produce is a moral byproduct of capitalist economic logic, which holds that letting people consume only as much as they produce is efficient because it maximizes production. If that stops being true, we should sever the link. Depending on exactly how true it gets, that could mean a pretty goddamn robust basic income law, where everyone is simply given enough money to buy, say, a comfortable middle-class lifestyle. Or it could mean abolishing money and living in a socialist, Star Trek-style utopia, economically speaking at least.

Now, there would still be issues in that brave new world. The robot revolution is unlikely to change anything about the scarcity of the surface area of the earth or of usable energy, so conventional economic analysis will still have some part to play in telling us how to structure things, even once robots have eliminated scarcity in many other areas. But one thing seems clear. If robots take all our jobs, and are actually doing them as well as or better than we were, we should all get to stop having the underlying premise of our lives be that we spend a third of them or more doing work just because someone else wants us to do it. That wouldn't mean no one would ever have a job, that we'd all just do things that would currently be described as "leisure." But in a sense even when we did something that would now look like a job, it would be leisure, because we'd be doing it for its own intrinsic value to us. Oh, and because we weren't being forced to do it on pain of starvation and homelessness by the social economic arrangement. And that. So if the robots come for our jobs, we should respond by just being okay not having jobs anymore, and then figuring out what to do with all our new free time. That could, potentially, be pretty awesome.

Also, see Matt Yglesias' blog for the occasional highly insightful consideration of what the economics of the technological future might look like. He tends to do a better job of acknowledging that changing technological conditions don't have to result in a path-of-least-resistance distortion of the current system, but might enable a totally different system altogether.
