Memory is just a story we believe.
I remember that when I was on a city bus, and so perhaps 8 years old, a friend showed me a "library card". I was amazed, but I knew that libraries were made for me.
When I saw the web ... No, not the web. It was Gopher. I read the minutes of a town meeting in New Zealand. I knew it was made for me. AltaVista - same thing.
Siri too. It's slow, but I'm good with adjusting my pace and dialect. We've been in the post-AI world for over a decade, but Siri is the mind with a name.
A simple mind, to be sure. Even so, Kurzweil isn't as funny as he used to be; maybe Siri's children will be here before 2100 after all.
In the meantime, we get squeezed...
Artificial intelligence: Difference Engine: Luddite legacy | The Economist
... if the Luddite Fallacy (as it has become known in development economics) were true, we would all be out of work by now—as a result of the compounding effects of productivity. While technological progress may cause workers with out-dated skills to become redundant, the past two centuries have shown that the idea that increasing productivity leads axiomatically to widespread unemployment is nonsense...
[there is]... the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused by not too little technological progress, but too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete....
... The argument against the Luddite Fallacy rests on two assumptions: one is that machines are tools used by workers to increase their productivity; the other is that the majority of workers are capable of becoming machine operators. What happens when these assumptions cease to apply—when machines are smart enough to become workers? In other words, when capital becomes labour. At that point, the Luddite Fallacy looks rather less fallacious.
This is what Jeremy Rifkin, a social critic, was driving at in his book, “The End of Work”, published in 1995. Though not the first to do so, Mr Rifkin argued prophetically that society was entering a new phase—one in which fewer and fewer workers would be needed to produce all the goods and services consumed. “In the years ahead,” he wrote, “more sophisticated software technologies are going to bring civilisation ever closer to a near-workerless world.”
...In 2009, Martin Ford, a software entrepreneur from Silicon Valley, noted in “The Lights in the Tunnel” that new occupations created by technology—web coders, mobile-phone salesmen, wind-turbine technicians and so on—represent a tiny fraction of employment... In his analysis, Mr Ford noted how technology and innovation improve productivity exponentially, while human consumption increases in a more linear fashion.... Mr Ford has identified over 50m jobs in America—nearly 40% of all employment—which, to a greater or lesser extent, could be performed by a piece of software running on a computer...
In their recent book, “Race Against the Machine”, Erik Brynjolfsson and Andrew McAfee from the Massachusetts Institute of Technology agree with Mr Ford's analysis—namely, that the jobs lost since the Great Recession are unlikely to return. They agree, too, that the brunt of the shake-out will be borne by middle-income knowledge workers, including those in the retail, legal and information industries...
Even in the near term, the US Labor Department predicts that the 17% of US workers in "office and administrative support" will be replaced by automation.
It's not only the winners of the first-world birth lottery who are threatened. Foxconn (Taiwan-based, with most of its factories in China) employs about 1 million people. Many of them will be replaced by robots.
It's disruptive, but given time we could adjust. Today's AIs aren't tweaking the permeability of free space; there are still a few things we do better than they do. We also have complementary cognitive biases; a neurotypical human with an AI in their pocket will do things few unaided humans can do. Perhaps even a 2045 AI will keep human pets for their unexpected insights. Either way, it's a job.
Perhaps more interestingly, a cognitively disabled human with a personal AI may be able to take on work that is now impossible.
Economically, of course, the productivity/consumption circuit has to close. AIs don't (yet) buy info-porn. If 0.1% of humans get 80% of revenue, then they'll be taxed at 90% marginal rates and the 99.9% will do subsidized labor. That's what we do for special needs adults now, and we're all special needs eventually.
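A toy back-of-envelope sketch shows why that arithmetic could close the circuit. Every figure below is an assumption invented for illustration (total income, workforce size, the 80% share, the 90% rate), not an estimate, and the rate is applied to the whole top share for simplicity:

    # Toy sketch of the tax-and-subsidize circuit described above.
    # All figures are assumptions for illustration; a real marginal rate
    # would apply only above a threshold, not to the whole top share.
    national_income = 20e12   # assumed total income: $20 trillion
    top_share = 0.80          # assumed share captured by the top 0.1%
    marginal_tax = 0.90       # assumed rate on that share
    workers = 150e6           # assumed number of workers to subsidize

    revenue = national_income * top_share * marginal_tax
    subsidy_per_worker = revenue / workers

    print(f"Revenue raised: ${revenue / 1e12:.1f} trillion")
    print(f"Subsidy available per worker: ${subsidy_per_worker:,.0f} per year")

Under these invented numbers the circuit closes with room to spare.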
So, given time, we can adjust. Problem is, we won't get time. We will need to adjust even as our world transforms exponentially. It could be tricky.
3 comments:
The employment rate is determined by macroeconomic policy. Wealth is determined by capital and technology.
When our economic leaders like the Fed and the federal government fail us, it is easy to see technology as a zero sum game where jobs are lost never to return.
But it is important to remember that there have been people, from Ludd on down, who have predicted that machines would replace labor. They have been wrong because there has always been something that was a marginally less productive use of a worker's time only because people were too poor to afford it. And now, with productivity increase Y, people are rich enough that they want a hot meal without cooking it themselves, or a yoga class, or a massage.
Humans are fundamentally insatiable. There is no level of wealth that we can obtain where we wouldn't want more. You and I are fabulously wealthy by any rational standard. We are richer than Croesus could have ever dreamt. And yet if you got a $50 a month raise, you certainly could spend it on something. Me too.
The insatiability of human beings has bad implications for the environment. But that also means that there will never be a time when people are out of work because we are too wealthy.
The only causes of mass unemployment are macroeconomic: failures of market coordination. Stagflation (inflation has reached the point where people want physical stuff more than investment), a liquidity trap (people all want to save at the same time, and that is impossible), or deflation (the money supply is artificially constrained). None of these has anything in particular to do with technological level, except that they only seem to occur in modern economies rather than Malthusian ones.
To be clear, I am not claiming that the authors of the books mentioned in the quote are Luddites. I am claiming that they do not realize that the singularity has happened. It was called the industrial revolution. The machines came and took all our jobs.
We are living in whitewater times right now and have been for a long time. They see the past thirty years as a time of stability and look forward to a time of rapid change. But the change is here, has been here, and will continue to be here.
We have become desensitized to the motion and mistake it for the calm before the storm.
I liked the way DeLong put it (though not lately) - the Singularity is in the past (and perhaps more than one).
Singularity purists, though, tend to assume "AS" (artificial sentience), and we're not quite there. At least as far as I know. The rest is just whitewater warmup.
My (limited) understanding of macroeconomic policy is that it can set overall employment, but not income distribution (inequality). In the absence of a minimum wage the base income may be more like serfdom than we're used to. With a minimum wage you get unemployment. With wage subsidies you get less serfdom, more employment, but higher taxes for the wealthy.
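To make that trade-off concrete, here is a toy linear labor-market sketch; the demand and supply curves and every number in them are invented for illustration, not drawn from data:

    # Toy linear labor market. Demand falls as the wage employers pay rises;
    # supply rises with the wage workers take home. All numbers are invented.
    def labor_demand(employer_wage):
        return max(0.0, 100 - 8 * employer_wage)   # millions of workers hired

    def labor_supply(worker_wage):
        return max(0.0, 10 * worker_wage - 20)     # millions willing to work

    # Baseline: demand equals supply at 100 - 8w = 10w - 20, so w = 120/18.
    w_eq = 120 / 18
    print(f"Baseline: ${w_eq:.2f}/hr, employment {labor_demand(w_eq):.1f}M")

    # Minimum wage above the baseline: hiring is set by the demand side.
    w_min = 9.00
    gap = labor_supply(w_min) - labor_demand(w_min)
    print(f"Minimum wage ${w_min:.2f}: employment {labor_demand(w_min):.1f}M, "
          f"{gap:.1f}M want work at that wage but are not hired")

    # Wage subsidy: employers pay w_e, workers take home w_e + subsidy.
    # Market clears where demand(w_e) = supply(w_e + subsidy), so w_e = 90/18.
    subsidy = 3.00
    w_e = 90 / 18
    jobs = labor_demand(w_e)
    print(f"Subsidy ${subsidy:.2f}: take-home ${w_e + subsidy:.2f}/hr, "
          f"employment {jobs:.1f}M, cost ${jobs * subsidy:.0f}M per hour worked")

In this made-up example the subsidy case has the most jobs and the highest take-home pay, with the bill landing on the tax side: the "less serfdom, more employment, higher taxes" trade-off in miniature.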