Tuesday, May 14, 2024
Medicine and culture: searches on anorexia from 2004 to 2024 declined by over 60%
I noticed a while back that eating disorders were no longer an active area of public anxiety. So I looked at Google Trends since 2004 (click for full size).
Wednesday, August 30, 2023
Mass disability - dysfunctional web sites, apps for everything
I last wrote about "mass disability" and the Left Behind in a 2021 post. The concept has sometimes seemed on the edge of going mainstream but it's never quite made it. Maybe we're getting closer; a recent Michael Tsai post (No App, No Entry) reminded me of my Mastodon thread from a few weeks ago:
What is the crazy that drives Trumpism and the many global equivalents?
It is that the minimal IQ to function well in the modern world is now about 120, and that eliminates most people. This is both the most important fact of our time and the least palatable. It is the thing that cannot be said, and it will be the ruin of us if we don't say it ... I've been saying this for years. Today I was reminded of it while doing some travel booking.
During the bookings I encountered:
1. A web site that didn't work with my older version of Safari (I knew what was wrong and switched to Chrome).
2. A Delta web site bug (I recognized it as a bug and knew what to do).
3. A place that was out of rental cars, but I knew Expedia would have some contracts that would let me find one.
4. Travel web sites that all needed new credentials...
... These are all routine parts of modern life, including maintaining flaky computer systems (let me tell you ...) and phones ... It was not like this even 35 years ago. Travel agents handled travel complexity. There were no smartphones. Computers were very limited. There was no internet for most. By necessity everyday life was much simpler. Most people could cope with it. Now most cannot cope. This is the most important feature of our time. And nobody can talk about it.
I remember some good discussions on this thread but I can't find any of them now. Perhaps by design Mastodon has a limited memory. (My home instance has no search, so I had to download my archive and search it to find the date of the post. Then I could slowly navigate to it.)
I expanded on the theme a bit later:
Hotel laundry year 2000
1. Insert quarters to buy detergent, operate washer and dryer.
IQ requirement: 65 (my son could do this after a demonstration)
Hotel laundry year 2023
1. Scan QR code to download app whose profit comes from unspent funds.
2. Install app, create account with Apple ID
3. Figure out the cryptic UX so you can deposit funds (several odd, unintuitive steps).
4. Deposit funds, paying just enough to cover this session. Pay the 25 cent low-transaction penalty ...
5. Spot the scam behind the app and avoid it (e.g., fund at the minimum).
6. Diagnose why, after paying money and confirming the machine, it's still not working.
7. Authorize the specific transaction.
8. Start laundry.
(My son could not do this)
9. When complete, delete the app.
IQ requirement: minimum 110, higher to spot the scam.
This is why America is burning.
People are scared and angry and feeling left behind -- and they can't come out and say they are unable to manage their tech. Because that's the ultimate shame.
See also:
- Biden's essential task is to help the Left Behind 1/2021
- Complexity, cognition, civilization 11/2021
- What is the middle class and why can't most Americans get there? 11/2020
- Broken world: applying for a minimum wage job via a corporate HR web site 3/2017
- Trumpism and transition to mass disability 8/2016
- Is most American poverty actually disability? 10/2013
- The post-AI era is also the era of mass disability 12/2012 (we've been in the AI world for years)
- Unemployment and the new economy 1/2011 (rather than unemployment we ended up over the next decade with a shrinking middle class and more people on the edge).
- Mass disability and the middle class 9/2011
Saturday, December 31, 2016
Crisis-T: blame it on the iPhone (too)
It’s a human thing. Something insane happens and we try to figure out “why now?”. We did a lot of that in the fall of 2001. Today I looked back at some of what I wrote then. It’s somewhat unhinged — most of us were a bit nuts then. Most of what I wrote is best forgotten, but I still have a soft spot for this Nov 2001 diagram …
I think some of it works for Nov 2016 too, particularly the belief/fact breakdown, the relative poverty, the cultural dislocation, the response to modernity and changing roles of women, and the role of communication technology. Demographic pressure and environmental degradation aren’t factors in Crisis-T though.
More than those common factors I’ve blamed Crisis-T on automation and globalization reducing the demand for non-elite labor (aka “mass disability”). That doesn’t account for the Russian infowar and fake news factors though (“Meme belief=facts” and “communications tech” in my old diagram). Why were they so apparently influential?
Maybe we should blame the iPhone …
Why Trolls Won in 2016, Bryan Mengus, Gizmodo
… Edgar Welch, armed with multiple weapons, entered a DC pizzeria and fired, seeking to “investigate” the pizza gate conspiracy—the debunked theory that John Podesta and Hillary Clinton are the architects of a child sex-trafficking ring covertly headquartered in the nonexistent basement of the restaurant Comet Ping Pong. Egged on by conspiracy videos hosted on YouTube, and disinformation posted broadly across internet communities and social networks, Welch made the 350-mile drive filled with righteous purpose. A brief interview with the New York Times revealed that the shooter had only recently had internet installed in his home….
…. the earliest public incarnation of the internet—USENET—was populated mostly by academia. It also had little to no moderation. Each September, new college students would get easy access to the network, leading to an uptick in low-value posts which would taper off as the newbies got a sense for the culture of USENET’s various newsgroups. 1993 is immortalized as the Eternal September when AOL began to offer USENET to a flood of brand-new internet users, and overwhelmed by those who could finally afford access, that original USENET culture never bounced back.
Similarly, when Facebook was first founded in 2004, it was only available to Harvard students … The trend has remained fairly consistent: the wealthy, urban, and highly-educated are the first to benefit from and use new technologies while the poor, rural, and less educated lag behind. That margin has shrunk drastically since 2004, as cheaper computers and broadband access became attainable for most Americans.
… the vast majority of internet users today do not come from the elite set. According to Pew Research, 63 percent of adults in the US used the internet in 2004. By 2015 that number had skyrocketed to 84 percent. Among the study’s conclusions were that, “the most pronounced growth has come among those in lower-income households and those with lower levels of educational attainment” …
… What we’re experiencing now is a huge influx of relatively new internet users—USENET’s Eternal September on an enormous scale—wrapped in political unrest.
“White Low-Income Non-College” (WLINC) and “non-elite” are politically correct [1] ways of speaking about the 40% of white Americans who have IQ scores below 100. It’s a population that was protected from net exposure until Apple introduced the first mass market computing device in June of 2007 — and Google and Facebook made mass market computing inexpensive and irresistible.
And so it has come to pass that in 2016 a population vulnerable to manipulation and yearning for the comfort of the mass movement has been dispossessed by technological change and empowered by the Facebook ad-funded manipulation engine.
So we can blame the iPhone too.
- fn -
[1] I think, for once, the term actually applies.
Tuesday, December 20, 2016
Save America. Vote GOP.
In the real world HRC is President and the GOP is beginning a painful reform process that will lead to a far better conservative party and a healthy American democracy.
In our consensus hallucination a walking tire fire is President, the GOP is further from reform than ever, and smart Dems are reading Josh Marshall’s advice. Oh, and the wake-up button isn’t working.
While we’re waiting for wakefulness we might as well come up with a plan or two. Plan one is to address the root cause of non-college misery. That will be useful if we survive (hint: avoid war with China) to get a sane government again.
Plan two is about getting a sane government. Towards that end we need to save the GOP from its addiction to the unreal. Unreality is a dangerous drug, after decades of abuse the GOP is in desperate need of rehab …
From Tabloids to Facebook: the Reality Wars (revised from my original)
I’ve been thinking about Russia’s successful hacking of the 2016 US election. It shouldn’t be seen in isolation.
It should be understood as part of the ancient human struggle with delusion and illusion — the reality wars.
In the US the reality wars were once bipartisan; each party struggled to separate fact from fantasy. Over the past few decades the GOP stopped fighting, they embraced the unreal. From Reagan to Gingrich to the Tea Party to Trump. By the 21st century we began seeing books like “The Republican War on Science”.
Unreality spread like a virus. AM talk radio was infested. Then came Drudge and Fox. Later Breitbart and finally the Facebook fake news stream. From the Clinton “murders” to birtherism to child pizza porn slaves.
This wasn’t bipartisan. The anti-reality meme, a core historic component of fascism, became concentrated in the GOP. Russia jumped on board, but Russia is more of a plague carrier than an intelligent agent. They lost their reality-war in the 90s. All their news is unreal now. Putin, like Trump, takes the fakes.
Trump’s victory is a triumph of the unreal. Of Will, I suppose. Now it threatens us all.
The rebellion against reason, against the perception of the real, is old. It’s a core component of fascism, but it’s much older than fascism. The Enlightenment was a setback for the unreal, but it wasn’t a final defeat. Now, in our troubled 3rd millennium, anti-reason is strong. It has taken over Russia. It has taken over the GOP, and Trump’s GOP has taken over America.
Somehow we have to rescue the GOP from its addiction to the unreal. That would be hard if it had been defeated. Now it seems impossible.
But there is a way. We need to vote GOP.
Vote GOP … in the primaries that is. In my home of Minnesota the Dem contenders are all pretty reasonable. I can send some money and volunteer to support the party, but my primary/caucus vote isn’t needed. On the other hand, the Minnesota GOP has lots of reality denialists running for office. I can use my primary vote to favor relatively sane GOP contenders.
If even half of Dems vote GOP in primaries we can ally with sane conservatives to pull the GOP back from the brink. Yes, there are a few sane conservatives. They are a dying breed, but there is room to ally with them here.
Then, in the election, we vote Dem. If America is lucky the Dems win. If America is unwise the GOP wins — but it’s a saner GOP. A setback, but not a catastrophe.
Work for a sane GOP. As a good Dem, vote GOP.
Sunday, October 16, 2016
How to give believers an exit from a cause gone bad
How do you give someone who has committed themselves to a bad cause a way out? You don’t do it by beating on how stupid they are …
From How to Build an Exit Ramp for Trump Supporters (Deepak Malhotra)
- Don’t force them to defend their beliefs … you will be much more effective if you encourage people to reconsider their perspective without saying that this requires them to adopt yours.
- Provide information, and then give them time … change doesn’t tend to happen during a heated argument. It doesn’t happen immediately.
- Don’t fight bias with bias … the one thing you can’t afford to lose if you want to one day change their mind: their belief about your integrity. They will not acknowledge or thank you for your even-handedness at the time they’re arguing with you, but they will remember and appreciate it later, behind closed doors. And that’s where change happens.
- Don’t force them to choose between their idea and yours. … you will be much more effective if you encourage people to reconsider their perspective without saying that this requires them to adopt yours.
- Help them save face…. have we made it safe for them to change course? How will they change their mind without looking like they have been foolish or naïve?
- Give them the cover they need. Often what’s required is some change in the situation—however small or symbolic—that allows them to say, “That’s why I changed my mind.” … For most people, these events are just “one more thing” that happened, but don’t underestimate the powerful role they can play in helping people who, while finally mentally ready to change their position, are worried about how to take the last, decisive step.
- Let them in. If they fear you will punish them the moment they change their mind, they will stick to their guns until the bitter end. This punishment takes many forms, from taunts of “I told you so” to being labeled “a flip-flopper” to still being treated like an outsider or lesser member of the team by those who were “on the right side all along.” This is a grave mistake. If you want someone to stop clinging to a failing course of action or a bad idea, you will do yourself a huge favor if you reward rather than punish them for admitting they were wrong…You have to let them in and give them the respect they want and need just as much as you.
If you're a Vikings fan feuding with your brother-in-law from Green Bay, feel free to break all these rules. If you're worried about the future of civilization you might try this instead.
For #5, saving face, look for something they could have been right about. To a climate change denier, agree that solar output varies. To a Trump follower, agree that the bleak future of the non-college adult wouldn't have gotten attention without his focus.
I’m adding this recipe to the Notes collection I carry on my phone.
Sunday, August 28, 2016
Trumpism: a transition function to the world of mass disability.
We know the shape of the socioeconomic future for the bottom 40% in the post globalization post AI mass disability world.
But how do we get there? How does a culture transition from memes of independence and southern Christian-capitalist marketarianism to a world where government deeply biases the economy towards low-education employment?
There needs to be a transition function. A transform that is applied to a culture. Writing with the anthropology perspective I've long sought, Arlie Hochschild makes the case that Trump is, among other things, a transition function that erases Tea Party Marketarianism and embraces the heresy of government support (albeit for the "deserving").
In a complex adaptive system we get the transition function we need rather than the one we want. No guarantee we survive it though.
See also:
- Trumpism is a Politics of Loss and Revenge 8/2016: Marshall demolishes the false dichotomy between economic distress and white racism.
- Gordon's Notes: Donald Trump is a sign of a healthy democracy. Really. 8/2015. Well, today I’d write “a desperate democracy”, but my tough love take on why we have this pathologic human being contending to more or less destroy civilization.
- Trump explained: Non-college white Americans now have higher middle-aged death rates than black Americans 11/2015. Marshall does a better job in his 2016 post.
Friday, August 19, 2016
Crab Bucket
Terry Pratchett taught me about “crab bucket” in Unseen Academicals [1]. I don’t know if it’s a metaphor of his part of England, or if it’s unique to the Discworld.
… She reached down and picked a crab out of a bucket. As it came up it turned out that three more were hanging on to it…
… ‘Oh that’s crabs for you,’ said Verity … ‘Thick as planks the lot of them. That’s why you can keep them in a bucket without a lid. Any that tries to get out gets pulled back…’
Crab bucket, thought Glenda … That’s how it works. People from the Sisters disapproving when a girl takes the trolley bus … Practically everything me mum ever told me…
I did find a wikipedia entry for “crab mentality”, which led to a 1994 article …
When teachers at Frank W. Ballou … talk about the crab bucket syndrome …
But the author doesn’t describe where the term comes from. It’s a useful concept; reminds me again how much we need to recreate anthropology.
[1] Written when Pratchett was well into his eventually terminal dementia syndrome, so while it’s very enjoyable for fans it’s not his best work.
Monday, November 02, 2015
Trump explained: Non-college white Americans now have higher middle-aged death rates than black Americans
From today’s NYT Health section:
Death Rates Rising for Middle-Aged White Americans. Gina Kolata Nov 2, 2015
… middle-aged white Americans. Unlike every other age group, unlike every other racial and ethnic group, unlike their counterparts in other rich countries, death rates in this group have been rising, not falling…
… two Princeton economists, Angus Deaton… and Anne Case. Analyzing health and mortality data from the Centers for Disease Control and Prevention and from other sources, they concluded that rising annual death rates among this group are being driven … by an epidemic of suicides and afflictions stemming from substance abuse: alcoholic liver disease and overdoses of heroin and prescription opioids…
… the declining health and fortunes of poorly educated American whites. In middle age, they are dying at such a high rate that they are increasing the death rate for the entire group of middle-aged white Americans…
… The mortality rate for whites 45 to 54 years old with no more than a high school education increased by 134 deaths per 100,000 people from 1999 to 2014.
The article falls apart a bit here. What we want to know is the absolute death rate for non-college middle-aged white Americans in 2013 versus 1999. We want to know how the Long Stagnation has changed vulnerable Americans, but Kolata's article mixes all white Americans with the no-college cohort.
Fortunately the PNAS article PDF is freely available, but unfortunately it explains Kolata’s problem — the data we want seems to be buried in an unlabeled parenthesis in Table 1. From that I think I can reconstruct the key information: [1].
Year | White no college | Black (all) | White some college | White BA+ | White All |
---|---|---|---|---|---|
1999 | 601 | 797 | 291 | 235 | 381 |
2013 | 736 | 582 | 288 | 178 | 415 |
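The reconstructed numbers are easier to read as changes. A minimal sketch (the rates are the figures as reconstructed in the table above, deaths per 100,000 for ages 45-54; treat them as approximations, not the official table):

```python
# Mid-life (45-54) death rates per 100,000, as reconstructed above
# from Table 1 of the PNAS paper -- approximations, not official figures.
rates = {
    "White no college":   {1999: 601, 2013: 736},
    "Black (all)":        {1999: 797, 2013: 582},
    "White some college": {1999: 291, 2013: 288},
    "White BA+":          {1999: 235, 2013: 178},
    "White All":          {1999: 381, 2013: 415},
}

# Absolute and percent change for each group, 1999 -> 2013
for group, r in rates.items():
    delta = r[2013] - r[1999]
    pct = 100 * delta / r[1999]
    print(f"{group:>18}: {delta:+5d} ({pct:+.0f}%)")
```

Run it and the two stories in the text pop out: no-college whites up about 135 per 100,000 (+22%), while black mortality fell by more than a quarter.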
For the no-college White American, 1999 was a pretty good year; probably the best ever. That was the era of NASCAR America and the candidacy of GWB, champion of the "regular" white guy. Employment demand was high and wages were rising. Yes, as a white guy without any college you had a shorter lifespan than the minority of white Americans with a college degree, but at least black Americans were even worse off. It's always comforting to have someone to look down on.
After 16 years of the Great Stagnation though, things are different. Suicide and substance abuse have pushed no-college white mortality to the level of 1999 black Americans, yet during the same period black American middle-aged mortality has fallen substantially. White no-college Americans are now at the bottom of the heap [1].
This is why we have the inchoate white rage that thunders through the GOP. This is why we have Donald Trump.
A large and culturally powerful part of America is in crisis. A cohort with lots of guns and a history of violence. Maybe we should pay attention. Trump is a signal.
- fn -
[1] There was no breakdown of black death rates by education; a 2012 census report said 29% of whites and 18% of blacks had a BA or higher. Since 80%+ of black Americans have no BA it’s likely no-college whites now have higher middle-aged mortality than no-college blacks.
See also
Update 11/4/2015
There’s been considerable coverage of this story, but it’s been disappointing. Both DeLong and Krugman missed the college vs. no-college white middle-age cohort, and I think that’s the important story. There’s also been some discussion of anger as a defining trait of the GOP base, but no connection to the extreme distress of their core voter.
I've seen speculation that this is all about narcotic overuse. I find that very hard to believe, but I admit the use of narcotics for pain relief in America has exceeded my expectations. I remember the 90s, when "pain is the new vital sign" was the mantra and family docs were berated for inadequate use of narcotics. I guess my peers responded well to that feedback.
It has occurred to me that there’s a potential bias we’re missing. Over the past 40 years colleges have gone from predominantly male to predominantly female. The big story here is increasing mortality in the no-college white cohort. But if there’s been a gender shift in that cohort, say from 55% female in 1999 to 45% female in 2013, that will make the no-college numbers even more dramatic. Since mortality has increased even when college grads are included this isn’t the entire story, but it will make the no-college effect more dramatic.
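That composition effect is just weighted-average arithmetic. A toy sketch: the 55%/45% female shares are the speculation above, and the per-sex rates of 500 and 700 per 100,000 are invented purely for illustration:

```python
# Hypothetical per-sex death rates (per 100,000) for the no-college
# white cohort -- invented numbers, for illustration only.
FEMALE_RATE, MALE_RATE = 500, 700

def cohort_rate(pct_female):
    """Aggregate cohort rate as a weighted average of per-sex rates."""
    return (pct_female * FEMALE_RATE + (100 - pct_female) * MALE_RATE) / 100

# Hold per-sex rates fixed; shift the cohort from 55% to 45% female.
print(cohort_rate(55))                    # 590.0
print(cohort_rate(45))                    # 610.0
print(cohort_rate(45) - cohort_rate(55))  # 20.0 -- rise from composition alone
```

Even with no change in anyone's actual risk, the measured cohort rate rises, which is exactly the bias worried about above.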
Thursday, October 29, 2015
Capitalism, fraud and maximizing wantability
WaPo has a delightfully meta-subversive headline for an article about the failings of 21st century capitalism: This Kardashian headline shows why two Nobel winners say the economy is broken. Beneath the headline is a photograph of 3 reasonably attractive women and the hit enhancing text “Kourtney, Kim and Khloe — arrive at the Maxim Hot 100 party”.
Jeff Guo's article proceeds to an interview with Akerlof and Shiller, reasonably well regarded academic economists, about their book Phishing for Phools. Unfortunately Guo eventually does get around to the Kardashians, which blunts the beauty of the introduction. Still, it is a lovely bit of meta; boosting page hits for an article about how easily humans are manipulated in the interests of feeding their wants.
Shockingly, it seems capitalism does not optimize our better selves.
I’ll let that sink in a bit.
Sure, you think it’s obvious that capitalism is a system for finding local minima traps in a 3 dimensional field where demand is gravity and information technology enables complexity enables deception. If pressed to respond further you might say something like “tobacco”.
It's not obvious to Americans though. Our culture equates wealth with virtue, and the "invisible hand" of capitalism with the "invisible hand" of a Calvinistic God. It's an authoritarian-dominance attractor in culture-space, and we're not the only people to get stuck in it.
So this is an article worth scanning, if only as a marker for the fading glamor of the 1990s capitalist (emphases mine) …
… Economics predicts that wherever there is a profit, someone will be there to make it. To that, Akerlof and Shiller propose a corollary: Wherever there is an opportunity to profit off people’s weaknesses, someone will exploit it…
… The basic idea of this book is that there is a “phishing” equilibrium, in which if there’s a profit to be made by taking advantage of your weakness, then that will be there.
… The standard view of markets (which is subject to problems of income distribution and externalities) is that markets will deliver the best possible outcome.
… that’s what the standard graduate student is taught. It’s what you’re told to believe, and what I think most economists do believe. As long as the markets are competitive, and there are no problems of income distribution and there are no externalities, it’s going to lead to the best possible world…
… that then has acquired a moral tone, which is that whatever happens in the market is okay. And that translates, in turn, into people arguing and thinking that it’s okay to be selfish. That if I earn this income, then I in some sense deserve it.
So this view that whatever markets do is good becomes this idea that whatever markets do is right…
… Kirman tracing the origins of this idea back to the Enlightenment. He says, “laissez faire made a lot of sense against the background of monarchy and controlling church.” So this idea of freeing the markets really came through at a time when businesses were being particularly oppressed….
… Irving Fisher was a Yale economist who in 1918 wrote a book saying the free market system is maximizing something but it’s not what Jeremy Bentham, the philosopher, called utility. So he named it wantability.
I did a Google N-grams search [how often a word appears in books] for wantability. The term enjoyed some popularity in the 1920s and 1930s, then exponentially decayed. After the Reagan-Thatcher revolution the term was gone….
… the children’s candy bars were put at children’s eye level …You have professionals who are designing everything. They are designing it for wantability.
Reading this, a part of me thinks I should get a Nobel just for my blog rants. Economists don't think market solutions have local minima traps? It's novel to think markets produce things that are bad for us? Stockholm, it's not that hard to find my real identity. I wouldn't mind the money. You can give me another prize for canopy economics and eco-econ.
So this isn’t a book I’m likely to buy. It’s an interesting marker, however, of our changing attitudes towards market capitalism and for the intellectual history of our judgments from Adam Smith to Donald Trump. Twenty years of lousy economic growth (great for elite, awful for non-college) will do that. I’ll be looking for more signs of thoughtfulness …
See also
- IT and productivity - two noteworthy posts from Equitable Growth 2/2015 - good set of links to my older stuff.
- Learning from an Amazon “Newer Galaxy” fraud: I too am prey. 10/2015. Just this morning! I got caught.
Thursday, July 02, 2015
Apple TV's PBS station has great shows, but an awful user experience
Anthro prof John Hawks, one of my favorite bloggers, is hosting a sure-to-be-good PBS series called First Peoples. You can stream the (ugh) Flash (ugh) version … too bad it’s not on Apple TV….
Except … it is. It’s just bloody hard to find. The PBS Channel Search menu searches only “Videos”, not “Shows”. To find Shows you have to scroll around … and around … and around… the “Shows” screen. Good luck with that. Once you find a show you can add it to Favorites.
I don’t know if this is something PBS can fix or if it’s some sort of Apple malfunction, but … wow … needs a fix. At the least Search needs to include Shows.
Thursday, November 14, 2013
Human domestication, the evolution of beauty, and your wisdom tooth extraction
My 16yo is having his wisdom teeth removed tomorrow. Blame it on human domestication.
The Economist explains the process. Domestication, whether it occurs in humans, foxes, or wolves, involves changes to "estradiol and neurotransmitters such as serotonin" (for example). These changes make humans less violent and better caregivers and partners -- major survival advantages for a social animal. They also have unexpected side-effects, like shortened muzzles and flattened faces for wolves, foxes, (cows?) and humans.
Since domesticated humans out-compete undomesticated humans, the physiologic markers of domestication become selected for. They begin to appear beautiful. Sexual selection reinforces the domestication process.
It seems to be ongoing ...
The evolution of beauty: Face the facts | The Economist:
... People also seem to be more beautiful now than they were in the past—precisely as would be expected if beauty is still evolving...
Which may be why we are becoming less violent.
Of course a shortened muzzle and smaller mandible have side-effects. Teeth in rapidly domesticating animals don't have room to move. Which is good news for orthodontists, and bad news for wisdom teeth.
See also:
- A link between autism and the domestication of humans and dogs: "MET gene selection appears to be related to human and canine domestication, including sociability. Variants of it are associated with autism-like behavior."
- The canid domestication of homo sapiens brutalis
Wednesday, November 14, 2012
Baumol's disease and the demographic transition: Productivity asymmetry means children are increasingly expensive
There has been a flurry of articles lately on the cost of raising an American child, including a NYT blog post. This is old news to students of demography; I remember reading about this back in the 80s. In agricultural societies children are a net economic gain, in industrial societies the gain is less, and in post-industrial societies they are a net negative for parents.
I should add, by the way, that traditional evaluations of the cost of children underestimate the cost. They assume a healthy neurotypical child. A special needs child is vastly more expensive, and approximately 5-10% of post-industrial children are relatively disabled by 'autism' (whatever that is), low average IQ, schizophrenia, affective disorders and the like.
So this is old material, but I don't recall a theoretical framework explaining why child raising takes a larger and larger proportion of income as a society becomes wealthier. The answer, of course, is Baumol's cost disease.
Child raising is one of those tasks with minimal productivity increases. Indeed, as output requirements rise to meet the narrowing demands of a post-labor economy, productivity may be negative over time. Certainly much of the cost is related to education and health care, two domains with notoriously slow productivity growth. Baumol's work teaches us that as overall productivity and wealth increase, relatively low productivity labor will consume increasing fractions of total income.
This seems to be an obvious insight, but a cursory Google search didn't find any articles or posts connecting demographic transitions with Baumol's Cost Disease. Until now.
Sunday, September 23, 2012
Islamic rage and free speech.
A guy with a big chip on his shoulder isn't going to do well. He's easy to play with, but even without manipulation he's going to wreck his life.
A lot of the Islamic world is like that, probably because the population is young, social structures are intensely patriarchal, employment opportunities suck, education is distorted by religious doctrine and the world is a very scary place. Just be glad the "The End of Men" isn't a video. Threats to patriarchal structures are not entirely imaginary.
But it's not just Islam. The anti-Japanese riots in China look awfully similar to Cairo; and does anyone remember burning cities in 1960s America?
So national rage isn't as simple as a cultural trait unique to Islam.
What about free speech, how simple is that?
As any high school student should be able to describe, we don't practice completely free speech in the US. The NYT mentions a couple of examples but Wikipedia has more ...
Freedom of speech in the United States - Wikipedia
... the Miller test for obscenity, child pornography laws, speech that incites imminent lawless action, and regulation of commercial speech such as advertising. Within these limited areas, other limitations on free speech balance rights to free speech and other rights, such as rights for authors and inventors over their works and discoveries (copyright and patent), protection from imminent or potential violence against particular persons (restrictions on fighting words), or the use of untruths to harm others (slander)...
... Publishing, gathering or collecting national security information is not protected speech in the United States ... The unauthorized creation, publication, sale or transfer of photographs or sketches of vital defense installations or equipment as designated by the President is prohibited.[13] The knowing and willful disclosure of certain classified information is prohibited ... It is prohibited for a person who learns of the identity of a covert agent through a "pattern of activities intended to identify and expose covert agents" to disclose the identity to any individual not authorized access to classified information, with reason to believe that such activities would impair U.S. foreign intelligence efforts..
Julian Assange could say something about national security and restrictions on free speech -- though the material he published was widely redistributed in the US without much consequence. The US didn't exercise blocks on web servers. Other information is more restricted. Although the Wikipedia article on nuclear weapon design is pretty detailed, more practical 'make your own terror weapon' recipes are hard to find.
Even videos of this type of restricted material would be unlikely to provoke popular outrage. Neither would videos attacking religious beliefs, politicians, science (the GOP makes those) or history -- denial of American slavery, Amerindian genocide or the Holocaust are all protected speech.
The only exception I can think of would be child pornography. Videos that use children under the age of 12 in sexual or harmful activities are strictly illegal in the US and would create anger and disgust. I can't see riots, though, and, of course, the Islamic reaction would be at least as negative.
National rage is not uniquely Islamic, but protected speech is pretty distinctly American. Perhaps a consequence of that protected speech is that while sticks and stones will trigger invasions, "names will never hurt us".
Saturday, April 21, 2012
Optimism bias in Potter fan-fic, software development, and government - we can correct
There may be atheists in foxholes, but there are few realists in the C-suite - or the White House.
Optimists rule, and they scorn realists as "pessimists" and "Cassandras" [1]. No matter that Kassandra Krugman is always right - still he is called Crow.
It smells like natural selection. In a universe where entropy rules, denial is a survival trait. Group selection, however, sprinkles a few realists about - grumpily cursed (by Apollo) to see things as they are.
Yes, the glass is half full. But that's good, because the wine is poisoned.
I think we're an oppressed minority.
No wonder realists love empiricism. Facts are our friends. We realists welcome science disguised in Harry Potter fan-fic ...
Harry Potter and the Methods of Rationality, Chapter 6: The Planning Fallacy (I added some paragraphs, emphases mine)
... "Muggle researchers have found that people are always very optimistic, like they say something will take two days and it takes ten, or they say it'll take two months and it takes over thirty-five years. Like, they asked students for times by which they were 50% sure, 75% sure, and 99% sure they'd complete their homework, and only 13%, 19%, and 45% of the students finished by those times. And they found that the reason was that when they asked people for their best-case estimates if everything went as well as possible, and their average-case estimates if everything went as normal, they got back answers that were statistically indistinguishable....
.... See, if you ask someone what they expect in the normal case, they visualize what looks like the line of maximum probability at each step along the way - namely, everything going according to plan, without any mistakes or surprises. But actually, since more than half the students didn't finish by the time they were 99% sure they'd be done, reality usually delivers results a little worse than the 'worst-case scenario'....
... It's called the planning fallacy, and the best way to fix it is to ask how long things took the last time you tried them. That's called using the outside view instead of the inside view. But when you're doing something new and can't do that, you just have to be really, really, really pessimistic. Like, so pessimistic that reality actually comes out better than you expected around as often and as much as it comes out worse. It's actually really hard to be so pessimistic that you stand a decent chance of undershooting real life. Like I make this big effort to be gloomy and I imagine one of my classmates getting bitten, but what actually happens is that the surviving Death Eaters attack the whole school to get at me...
It's music to my ears.
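The calibration gap the chapter describes can be made concrete with a few lines of arithmetic, using the figures from the study as quoted above (stated confidence levels versus the fraction of students who actually finished on time):

```python
# Students' stated confidence vs. the fraction who actually finished
# by their own deadlines (figures as quoted in the chapter above).
stated = [0.50, 0.75, 0.99]
actual = [0.13, 0.19, 0.45]

for s, a in zip(stated, actual):
    gap = s - a  # how overconfident each confidence level was
    print(f"said {s:.0%} sure -> {a:.0%} finished (overconfident by {gap:.0%})")
```

Note that even the "99% sure" estimate misses by more than half - which is the point: the worst-case guess is still optimistic.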
In my small world I see this every day. My optimist friend tells me it takes 30 minutes to enter expenses, but I track these things and I know it takes 1-2 hours. Another optimist says we'll deliver a new software feature in two months; I know that five months is optimistic and 8 months more realistic.
When we follow Agile Software Development rules, however, we base our estimates on examples from previous "sprints". We "take the outside view". It works!
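The sprint-based correction can be sketched in a few lines: scale a new inside-view estimate by the historical ratio of actual time to estimated time. All the numbers here are hypothetical, a minimal sketch rather than any real tracking data:

```python
# "Taking the outside view": correct a new estimate using the historical
# overrun ratio from past sprints. Numbers below are hypothetical.
past_estimates = [30, 45, 60, 30]   # minutes, what we predicted
past_actuals   = [75, 90, 150, 60]  # minutes, what really happened

overrun = sum(past_actuals) / sum(past_estimates)  # historical slip factor

def outside_view(inside_estimate_min):
    """Adjust an optimistic inside-view estimate by past overruns."""
    return inside_estimate_min * overrun

# The optimist's "30-minute" expense report, adjusted:
print(f"{outside_view(30):.0f} minutes")
```

With these sample numbers the slip factor is a bit over 2x, which is roughly the gap between "30 minutes to enter expenses" and the 1-2 hours it actually takes.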
The Outside View is why Chile makes reasonable predictions about government finance, while elected officials force the CBO into an artificial Planning Fallacy... (emphases mine).
Bias in Government Forecasts | Jeff Frankels (via Mark Thoma)
Why do so many countries so often wander far off the path of fiscal responsibility? Concern about budget deficits has become a burning political issue in the United States, has helped persuade the United Kingdom to enact stringent cuts despite a weak economy, and is the proximate cause of the Greek sovereign-debt crisis, which has grown to engulf the entire eurozone. Indeed, among industrialized countries, hardly a one is immune from fiscal woes.
Clearly, part of the blame lies with voters who don’t want to hear that budget discipline means cutting programs that matter to them, and with politicians who tell voters only what they want to hear. But another factor has attracted insufficient notice: systematically over-optimistic official forecasts.
... Over the period 1986-2009, the bias in official U.S. deficit forecasts averaged 0.4 % of GDP at the one-year horizon, 1% at two years, and 3.1% at three years. Forecasting errors were particularly damaging during the past decade. The U.S. government in 2001-03, for example, was able to enact large tax cuts and accelerated spending measures by forecasting that budget surpluses would remain strong. The Office of Management and Budget long turned out optimistic budget forecasts, no matter how many times it was proven wrong. For eight years, it never stopped forecasting that the budget would return to surplus by 2011, even though virtually every independent forecast showed that deficits would continue into the new decade unabated.
... to get optimistic fiscal forecasts out of the Congressional Budget Office a third, more extreme, strategy was required....
... To understand the third strategy, begin with the requirement that CBO’s baseline forecasts must take their tax and spending assumptions from current law. Elected officials in the last decade therefore hard-wired over-optimistic budget forecasts from CBO by excising from current law expensive policies that they had every intention of pursuing in the future. Often they were explicit about the difference between their intended future policies and the legislation that they wrote down.
Four examples: (i) the continuation of wars in Afghanistan and Iraq (which were paid for with “supplemental” budget requests when the time came, as if they were an unpredictable surprise); (ii) annual revocation of purported cuts in payments to doctors that would have driven them out of Medicare if ever allowed to go into effect; (iii) annual patches for the Alternative Minimum Tax (which otherwise threatened to expose millions of middle class families to taxes that had never been intended to apply to them); and (iv) the intended extension in 2011 of the income tax cuts and estate tax abolition that were legislated in 2001 with a sunset provision for 2010, which most lawmakers knew would be difficult to sustain...
...how can governments’ tendency to satisfy fiscal targets by wishful thinking be overcome? In 2000, Chile created structural budget institutions that may have solved the problem. Independent expert panels, insulated from political pressures, are responsible for estimating the long-run trends that determine whether a given deficit is deemed structural or cyclical.
The result is that, unlike in most industrialized countries, Chile’s official forecasts of growth and fiscal performance have not been overly optimistic, even in booms. The ultimate demonstration of the success of the country’s fiscal institutions: unlike many countries in the North, Chile took advantage of the 2002-2007 expansion to run substantial budget surpluses, which enabled it to loosen fiscal policy in the 2008-2009 recession ...
Humans are programmed to be foolishly optimistic, but group selection keeps realists around so that famines don't quite kill everyone.
If we know that our programming is defective, we can correct. Realists know we can learn, because sometimes we do learn.
[1] Update: I should add that Cassandra was, of course, the ultimate realist. She was always right. Her Curse wasn't pessimism; it was that Apollo made humans deaf to her warnings. The ancient Greeks apparently understood the planning fallacy. True pessimists probably exist, but they are rare enough that one should consider coexisting clinical depression.
Saturday, March 24, 2012
Liberals and conservatives - it's in our programming
It helps if you think of humans as biological robots with varying programming ...
Politics, Odors and Soap - Kristof - NYTimes.com
... “The Righteous Mind,” by Jonathan Haidt, a University of Virginia psychology professor, argues that, for liberals, morality is largely a matter of three values: caring for the weak, fairness and liberty. Conservatives share those concerns (although they think of fairness and liberty differently) and add three others: loyalty, respect for authority and sanctity...
It has some face validity; it certainly fits my own values (hint - 'respect for authority' is not one of them, which is a bit of a disadvantage in corporate and military settings). I don't think it captures the full difference however; conservatives [1] and liberals have very different attitudes towards the (non-genetically related) weak. Perhaps in conservatives of all colors, cultures, times, and places loyalty is tied to notions of tribe (race) and family. The unrelated weak are a distant abstraction.
It also doesn't fully explain the historical paranoia of the American right (peaking again) in a whitewater world. For that we need to look at responses to novelty as well as to threats to power, tribe, and authority.
The model isn't complete then, but it's useful. It's easier to live with America's conservatives if we understand it's not their fault. Liberals and conservatives just have different operating systems; politics is our essential interface.
[1] So are libertarians a form of conservative or a third branch? They are less interested in caring for the weak than many conservatives.
Saturday, December 24, 2011
The "War on Christmas" is not entirely delusional
The American "War on Christmas" movement seems thoroughly silly ....
Reminder: Tis the Season Not to Be an Ass – Whatever
...it’s about as silly as it ever was, considering that Christmas has conquered December, occupied November and metastasized into late October. To suggest that the holiday is under serious threat from politically correct non-Christians is like suggesting an earthworm is a serious threat to a Humvee. This is obvious enough to anyone with sense that I use The War on Christmas as an emergency diagnostic, which is to say, if you genuinely believe there’s a War on Christmas, you may want to see a doctor, since you might have a tumor pressing on your frontal lobes.
Seems silly, is silly.
And yet, I agree with TNC that Rick Perry is not completely delusional ...
Rick Perry and the Politics of Resentment - Ta-Nehisi Coates - Politics - The Atlantic
... What strikes me is the sense of being under siege, a constant theme in conservative politics. It is as if time itself is against them. And they know it. The line "I'm not ashamed to admit that I'm a Christian" stands out. Who is ashamed of this? This is a predominantly Christian country, and one of the most religious in the West. People don't "admit" their Christianity here. They proclaim it -- as the president has done repeatedly.
But what if there's something else? What if the conservatives are more perceptive and honest than the moderate liberals? I love Grant and Lincoln, but they were dead wrong in claiming that emancipation did not promote "social equality." Meanwhile the bigots who asserted that emancipation meant that Sambo would be "marryin yer daughters" were right. I wouldn't be shocked if Grant and Lincoln knew this, but also knew that to admit as much would be suicidal...
Yes, to most of the world the US seems to border on theocracy. But I was born into a true western theocracy, and it fell apart in less than 10 years ...
Quiet Revolution - Quebec History
The Quiet Revolution is the name given to a period of Quebec history extending from 1960 to 1966...
... The first major change that took place during the Quiet Revolution was the large-scale rejection of past values. Chief among these are those that Michel Brunet called “les trois dominantes de la pensée canadienne-française: l’agriculturisme, le messianisme et l’anti-étatisme” [the three main components of French Canadian thought: agriculturalism, messianism and anti-statism]. In this respect, Quebec entered resolutely into a phase of modernisation: its outlook became more secular (as opposed to religious), much of the traditionalism that characterised the past was replaced by increasingly liberal attitudes; long standing demographic tendencies, associated with a traditional rural way of life (high marriage, birth and fertility rates), were rapidly reversed ...
Quebec seemed stuck in the past -- until it lurched into the future. Societies can change very quickly.
Consider the case of the 2012 Presidential campaign. The GOP's presidential candidate will be theologically non-Christian (though culturally mainstream Protestant). The Dems candidate will be mainstream Protestant but raised partly in Islamic Indonesia.
That seems different, even if the current candidates aren't as theologically extreme as Jefferson, Adams or Madison. I would not be surprised if the religious attitudes of 2020 America were similar to those of 2000 Britain.
The religious right is right to be afraid, but wrong to think there's a conspiracy they can fight. Their foe is history, and it's hard to fight history. Just ask al Qaeda.
Saturday, November 12, 2011
What are the consequences of extreme executive income?
Despite a few hiccups in our economy, the diversion of money to executive compensation continues, particularly to the shareholder employees [1] of large publicly traded corporations. The US is in the lead, but other countries are following a similar trend.
I've seen much discussion of the trend, but not so much about the effects on corporations - regardless of social justice or market operation [2].
I don't think we know what it means, but I can make some informed guesses.
First, we can dispense with the myth that employees don't know what CEOs are paid. I suspect even people working with their arms and backs know their CEO's compensation. Certainly middle-management and knowledge workers know.
So how does that affect employees? And, perhaps more interestingly, how does it affect executives?
Employees, in most corporations today, see limited raises, underfunded projects, difficult work conditions and employment uncertainty. They do the arithmetic; half the CEO's compensation would fund all the projects they know of. This has obvious and direct effects on morale.
No, they don't imagine they'll sit in the CEO seat one day, or even another C-seat. Employees aren't that dumb.
How does this affect executives?
Well, it's a rare human who doesn't think they deserve their salary. If you pay a CEO 50 million dollars, they assume they deserve 50 million dollars. They can do arithmetic too. This must mean they are 250 times smarter, faster, wiser, stronger, and better than their superstar worker bees. They have gifts far beyond the ken of mortal men.
They make decisions accordingly.
It also moves the executive class into a different sort of reality. They still age and die, but most of the time that is forgotten. They are free of the other concerns of mortal life. They don't fly coach. They don't deal with time tracking and travel expenses. They don't have to manage their Flex accounts. Their lives are relatively complexity free.
Executive hyper-compensation may explain a lot of the poor decisions and poor returns of the modern publicly traded company. Not so much from the diversion of revenue, but from its impacts on employees and, most of all, because of its effect on executives.
--
[1] The CEO, CFO, etc of a publicly traded company are, in theory, employees of shareholders.
[2] I think this is a market failure. I've known several CEO class executives. They are not necessarily imaginative, insightful or academically intelligent, but they are always good at operating in the corporate setting, they always work very long hours, and they always sacrifice a great deal. Whether that helps the corporation or not is debatable; their selection pressures are complex. Even so, it would be reasonable to compensate a CEO of this sort at 1-2 million dollars (total) a year. We are far beyond that level of compensation at large PTCs.
There is a contrary argument of course. At a certain level of power and wealth, individuals gain direct access to the global wealth stream. There are many ways to divert tens of millions of dollars from that stream that don't involve working for a PTC. Perhaps that's what boards are bidding against.
Saturday, October 22, 2011
In fifty years, what will our sins be?
In my early years white male heterosexual superiority was pretty much hardwired into my culture. I grew up in Quebec, so in my earliest pre-engagement years add the local theocracy of the Catholic church.
Mental illness, including schizophrenia, was a shameful sin. Hitting children was normal and even encouraged. There were few laws protecting domestic animals. There were almost no environmental protections. Children and adults with cognitive disorders were scorned and neglected. Physical disabilities were shameful; there were few accommodations for disability.
Our life then had a lot in common with China today.
Not all of these cultural attitudes are fully condemned, but that time is coming.
So what are the candidates for condemnation in 50 years? Gus Mueller, commenting on a WaPo article, suggests massive meat consumption and cannabis prohibition.
I am sure Gus is wrong about cannabis prohibition. Even now we don't condemn the ideal of alcohol prohibition; many aboriginal communities around the world still enforce alcohol restrictions and we don't condemn them. We consider American Prohibition quixotic, but not evil.
My list is not far from the WaPo article. Here's my set:
- Our definition and punishment of crime, particularly in the context of diminished capacity.
- Our tolerance of poverty, both local and global.
- Our wastefulness.
- Our tolerance of political corruption.
- Our failure to create a carbon tax.
- The use of semi-sentient animals as meat. (WaPo just mentions industrial food production. I think the condemnation will be deeper.)
- Our failure to confront the responsibilities and risks associated with the creation of artificial sentience. (Depending on how things turn out, this might be celebrated by our heirs.)
The WaPo article mentions our isolation of the elderly. I don't think so; I think that will be seen more as a tragedy than a sin. This is really about the modern mismatch between physical and cognitive lifespan.
The article is accompanied by a poll with this ranking as of 5800 votes:
- Environment
- Food production
- Prison system
- Isolation of the elderly
Friday, July 08, 2011
Why is the modern GOP crazy?
The GOP wasn't always this crazy. Minnesota's Arne Carlson, for example, wasn't a bad governor. Schwarzenegger had his moments.
Ok, so the modern GOP has never been all that impressive. Still, it wasn't 97% insane until the mid-90s.
So what happened?
I don't think it's the rise of corporate America or the amazing concentration of American wealth. The former impacts both parties, and not all the ultra-wealthy are crazy. These trends make the GOP dully malign, but the craziness of the Koch brothers ought to be mitigated by better-informed greed.
That leaves voters. So why have a substantial fraction, maybe 20%, of Americans shifted to the delusional side of the sanity spectrum? It's not just 9/11 -- this started before that, though it's easy to underestimate how badly bin Laden hurt the US. It can't be just economic distress -- Gingrich and GWB rose to power in relatively good times.
What's changed for the GOP's core of north-euro Americans (aka non-Hispanic "white" or NEA)?
Well, the interacting rise of the BRIC and the ongoing IT revolution did hit the GOP-voting NEA very hard, perhaps particularly among "swing" voters. That's a factor.
Demographics is probably a bigger factor. I can't find any good references (help?) but given overall population data I am pretty sure this population is aging quickly. A good fraction of the core of the GOP is experiencing the joys of entropic brains (here I speak from personal white-north-euro-middle-age experience). More importantly, as Talking Points describes, this group is feeling the beginning of the end of its tribal power. My son's junior high graduating class wasn't merely minority NEA, it was small minority NEA.
This is going to get worse before it gets better. The GOP is going to explore new realms of crazy before it finds a new power base; either as a rebuilt GOP or a new party.
It's a whitewater world.
Update 7/8/11: Coincidentally, 538 provides some data on GOP craziness ....
Behind the Republican Resistance to Compromise - NYTimes.com
... Until fairly recently, about half of the people who voted Republican for Congress (not all of whom are registered Republicans) identified themselves as conservative, and the other half as moderate or, less commonly, liberal. But lately the ratio has been skewing: in last year’s elections, 67 percent of those who voted Republican said they were conservative, up from 58 percent two years earlier and 48 percent ten years ago.
This might seem counterintuitive. Didn’t the Republicans win a sweeping victory last year? They did, but it had mostly to do with changes in turnout. Whereas in 2008, conservatives made up 34 percent of those who cast ballots, that number shot up to 42 percent last year...
... the enthusiasm gap did not so much divide Republicans from Democrats; rather, it divided conservative Republicans from everyone else. According to the Pew data, while 64 percent of all Republicans and Republican-leaning independents identify as conservative, the figure rises to 73 percent for those who actually voted in 2010...
Saturday, July 02, 2011
NYT's 1982 article on how teletext would transform America
(with thanks to Joseph P for the cite).
There were familiar computing names in the 1980s - Apple, IBM and so on. There were also many now lost, such as Atari and Commodore PCs. There were networks and email and decades old sophisticated collaboration technologies now almost lost to memory.
Against that background the Institute for the Future tried to predict the IT landscape of 1998. They were looking 16 years ahead.
You can see how well they did. For reasons I'll explain, the italicized text marks word substitutions. Emphases mine ...
STUDY SAYS TECHNOLOGY COULD TRANSFORM SOCIETY (June 13, 1982)
WASHINGTON, June 13— A report ... made public today speculates that by the end of this century electronic information technology will have transformed American home, business, manufacturing, school, family and political life.
The report suggests that one-way and two-way home information systems ... will penetrate deeply into daily life, with an effect on society as profound as those of the automobile and commercial television earlier in this century.
It conjured a vision, at once appealing and threatening, of a style of life defined and controlled by network terminals throughout the house.
As a consequence, the report envisioned this kind of American home by the year 1998: ''Family life is not limited to meals, weekend outings, and once a-year vacations. Instead of being the glue that holds things together so that family members can do all those other things they're expected to do - like work, school, and community gatherings -the family is the unit that does those other things, and the home is the place where they get done. Like the term 'cottage industry,' this view might seem to reflect a previous era when family trades were passed down from generation to generation, and children apprenticed to their parents. In the 'electronic cottage,' however, one electronic 'tool kit' can support many information production trades.''...
... The report warned that the new technology would raise difficult issues of privacy and control that will have to be addressed soon to ''maximize its benefits and minimize its threats to society.''
The study ... was an attempt at the risky business of ''technology assessment,'' peering into the future of an electronic world.
The study focused on the emerging videotex industry, formed by the marriage of two older technologies, communications and computing. It estimated that 40 percent of American households will have internet service by the end of the century. By comparison, it took television 16 years to penetrate 90 percent of households from the time commercial service was begun.
The ''key driving force'' controlling the speed of computer communications penetration, the report said, is the extent to which advertisers can be persuaded to use it, reducing the cost of the service to subscribers.
''Networked systems create opportunities for individuals to exercise much greater choice over the information available to them,'' the researchers wrote. ''Individuals may be able to use network systems to create their own newspapers, design their own curricula, compile their own consumer guides.
''On the other hand, because of the complexity and sophistication of these systems, they create new dangers of manipulation or social engineering, either for political or economic gain. Similarly, at the same time that these systems will bring a greatly increased flow of information and services into the home, they will also carry a stream of information out of the home about the preferences and behavior of its occupants.'' Social Side Effects
The report stressed what it called ''transformative effects'' of the new technology, the largely unintended and unanticipated social side effects. ''Television, for example, was developed to provide entertainment for mass audiences but the extent of its social and psychological side effects on children and adults was never planned for,'' the report said. ''The mass-produced automobile has impacted on city design, allocation of recreation time, environmental policy, and the design of hospital emergency room facilities.''
Such effects, it added, were likely to become apparent in home and family life, in the consumer marketplace, in the business office and in politics.
Widespread penetration of the technology, it said, would mean, among other things, these developments:
- The home will double as a place of employment, with men and women conducting much of their work at the computer terminal. This will affect both the architecture and location of the home. It will also blur the distinction between places of residence and places of business, with uncertain effects on zoning, travel patterns and neighborhoods.
- Home-based shopping will permit consumers to control manufacturing directly, ordering exactly what they need for ''production on demand.''
- There will be a shift away from conventional workplace and school socialization. Friends, peer groups and alliances will be determined electronically, creating classes of people based on interests and skills rather than age and social class.
- A new profession of information ''brokers'' and ''managers'' will emerge, serving as ''gatekeepers,'' monitoring politicians and corporations and selectively releasing information to interested parties.
- The ''extended family'' might be recreated if the elderly can support themselves through electronic homework, making them more desirable to have around.
... The blurring of lines between home and work, the report stated, will raise difficult issues, such as working hours. The new technology, it suggested, may force the development of a new kind of business leader. ''Managing the complicated communication in networks between office and home may require very different styles than current managers exhibit,'' the report concluded.
The study also predicted a much greater diversity in the American political power structure. ''Electronic networks might mean the end of the two party system, as networks of voters band together to support a variety of slates - maybe hundreds of them,'' it said.
Now read this article on using software bots (not robots, contrary to the title) to shape and control social networks and opinions and two recent posts of mine on the state of blogging.
So, did the Institute for the Future get it right - or not?
I would say they did quite well, though they are more right about 2011 than about 1998. I didn't think so at first, because they used words like "videotext" and "teletext". They sound silly because we still do very little with telepresence or videoconferencing -- contrary to the expectations of the last seventy years.
On careful reading though, it was clear what they called "teletext and videotext" was approximately "email and rich media communications". So I substituted the words "computer", "internet" and "networked systems" where appropriate. Otherwise I just bolded a few key phrases.
Rereading it now, they got quite a bit right. They weren't even that far off on home penetration. They also got quite a bit wrong. The impact on politics seems to have contributed to polarization rather than diversity. Even now few elders use computer systems to interact with grandchildren, and none did in 1998.
So, overall, they were maybe 65% right, but about 10 years premature (on a 16-year timeline!). That's not awful for predicting the near future, but they'd have done even better to follow Charles Stross's prediction rules ...
The near-future is comprised of three parts: 90% of it is just like the present, 9% is new but foreseeable developments and innovations, and 1% is utterly bizarre and unexpected.
(Oh, and we're living in 2001's near future, just like 2001 was the near future of 1991. It's a recursive function, in other words.)
However, sometimes bits of the present go away. Ask yourself when you last used a slide rule — or a pocket calculator, as opposed to the calculator app on your phone or laptop, let alone trig tables. That's a technological example. Cultural aspects die off over time, as well. And I'm currently pondering what it is that people aren't afraid of any more. Like witchcraft, or imminent thermonuclear annihilation....