Memory is just a story we believe.
I remember that when I was on a city bus, and so perhaps 8 years old, a friend showed me a "library card". I was amazed, but I knew that libraries were made for me.
When I saw the web ... No, not the web. It was Gopher. I read the minutes of a town meeting in New Zealand. I knew it was made for me. Alta Vista - same thing.
Siri too. It's slow, but I'm good with adjusting my pace and dialect. We've been in the post-AI world for over a decade, but Siri is the mind with a name.
A simple mind, to be sure. Even so, Kurzweil isn't as funny as he used to be; maybe Siri's children will be here before 2100 after all.
In the meantime, we get squeezed...
Artificial intelligence: Difference Engine: Luddite legacy | The Economist
... if the Luddite Fallacy (as it has become known in development economics) were true, we would all be out of work by now—as a result of the compounding effects of productivity. While technological progress may cause workers with out-dated skills to become redundant, the past two centuries have shown that the idea that increasing productivity leads axiomatically to widespread unemployment is nonsense...
[there is]... the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused by not too little technological progress, but too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete....
... The argument against the Luddite Fallacy rests on two assumptions: one is that machines are tools used by workers to increase their productivity; the other is that the majority of workers are capable of becoming machine operators. What happens when these assumptions cease to apply—when machines are smart enough to become workers? In other words, when capital becomes labour. At that point, the Luddite Fallacy looks rather less fallacious.
This is what Jeremy Rifkin, a social critic, was driving at in his book, “The End of Work”, published in 1995. Though not the first to do so, Mr Rifkin argued prophetically that society was entering a new phase—one in which fewer and fewer workers would be needed to produce all the goods and services consumed. “In the years ahead,” he wrote, “more sophisticated software technologies are going to bring civilisation ever closer to a near-workerless world.”
...In 2009, Martin Ford, a software entrepreneur from Silicon Valley, noted in “The Lights in the Tunnel” that new occupations created by technology—web coders, mobile-phone salesmen, wind-turbine technicians and so on—represent a tiny fraction of employment... In his analysis, Mr Ford noted how technology and innovation improve productivity exponentially, while human consumption increases in a more linear fashion.... Mr Ford has identified over 50m jobs in America—nearly 40% of all employment—which, to a greater or lesser extent, could be performed by a piece of software running on a computer...
In their recent book, “Race Against the Machine”, Erik Brynjolfsson and Andrew McAfee from the Massachusetts Institute of Technology agree with Mr Ford's analysis—namely, that the jobs lost since the Great Recession are unlikely to return. They agree, too, that the brunt of the shake-out will be borne by middle-income knowledge workers, including those in the retail, legal and information industries...
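The exponential-versus-linear argument in the quoted passages can be made concrete with toy numbers (mine, not Ford's): if output per worker doubles each decade while total demand grows only 30%, the number of workers needed to meet demand falls steadily.

```python
# Toy illustration (hypothetical numbers, not Ford's data):
# exponential productivity growth vs. slower demand growth
# implies shrinking demand for labor.

workers_needed = []
productivity = 1.0    # output per worker (arbitrary units)
consumption = 100.0   # total demand; initially requires 100 workers

for decade in range(4):
    workers_needed.append(consumption / productivity)
    productivity *= 2.0   # doubles each decade (exponential)
    consumption *= 1.3    # grows 30% each decade (much slower)

print([round(w, 2) for w in workers_needed])
```

With these made-up rates, required employment drops from 100 workers to under 30 within three decades; the point is the direction of the curve, not the particular values.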
Even in the near term, the US Labor Department predicts that the 17% of US workers in "office and administrative support" will be replaced by automation.
It's not only the winners of the first-world birth lottery who are threatened.
Taiwan-based Foxconn employs about a million people in China. Many of them will be replaced by robots.
It's disruptive, but given time we could adjust. Today's AIs aren't tweaking the permeability of free space; there are still a few things we do better than they do. We also have complementary cognitive biases; a neurotypical human with an AI in their pocket will do things few unaided humans can do. Perhaps even a 2045 AI will keep human pets for their unexpected insights. Either way, it's a job.
Perhaps more interestingly, a cognitively disabled human with a personal AI may be able to take on work that is now impossible.
Economically, of course, the productivity/consumption circuit has to close. AIs don't (yet) buy info-porn. If 0.1% of humans get 80% of revenue, then they'll be taxed at 90% marginal rates and the 99.9% will do subsidized labor. That's what we do for special-needs adults now, and we're all special needs eventually.
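As a back-of-the-envelope sketch of that circuit (the numbers are this essay's hypotheticals, not data, and the 90% rate is applied here as a flat rate on the top share rather than a true marginal schedule):

```python
# Hypothetical redistribution arithmetic: 0.1% of people capture 80% of
# revenue; a 90% tax on that share funds subsidized labor for the rest.

total_revenue = 100.0   # normalize the whole economy to 100 units
top_share = 0.80        # fraction captured by the top 0.1%
tax_rate = 0.90         # simplified flat rate on the top's share

top_income = total_revenue * top_share           # 80 units pre-tax
tax_collected = top_income * tax_rate            # 72 units redistributed
top_after_tax = top_income - tax_collected       # 8 units kept

bottom_income = total_revenue * (1 - top_share)  # 20 units earned directly
bottom_total = bottom_income + tax_collected     # 92 units after transfers

print(top_after_tax, bottom_total)
```

Even under these extreme assumptions the arithmetic closes: the 99.9% end up with most of the purchasing power, which is what keeps the consumption side of the circuit alive.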
So, given time, we can adjust. Problem is, we won't get time. We will need to adjust even as our world transforms exponentially. It could be tricky.
See also: