Then reality set in. Contrary to popular belief, brain decay is not a late-life disorder. It starts in our twenties ....
This Is Your Faulty Brain, On a Microchip - Memory Forever - Gizmodo
... Starting in your 20s—not old age—behavioral evidence suggests that you enter a linear cascade of general cognitive decline....

The Gizmodo article, clearly written by a young chap, imagines we'll outsource our recall and declining cognition to an onboard chip (vs., say, Google). Sure.
This decline is most notable in mentally demanding tasks, like processing speed (how quickly you handle incoming information), attention, working memory (how well you manipulate and keep information active in your mind), and, of course, long-term memory.
In real life, these effects are seen in everything from how long it takes to learn a new skill to how quickly you can recall a factoid....
While we're waiting to be chipped, however, knowledge work is becoming ever more demanding - and non-knowledge work doesn't pay too well (unless you're a CEO, in which case non-knowledge work can pay very well).
In the post-modern world, unless we can bend that decay curve (hello? dementia meds?), many of us will have a hard time doing competitive knowledge work into our 60s - much less our 70s. Bagging groceries, yes; genomic engineering, not so much.
That could be a bit of an economics problem.
We really should be spending more money on trying to bend that curve. We need my generation to earn money until we take our dirt bath.