Showing posts with label anthropology. Show all posts

Wednesday, May 18, 2011

Conversations - From emotional confrontation to dialog

I'm back from a two day corporate class on VitalSmarts Crucial Conversations (see also: Amazon reviews of the book). I'm going to summarize here what was new to me, and what I'm going to do differently in my personal and corporate life. This is how I process new material, so please feel free to skip this post if this material isn't your cup of tea.

I'm not going to review or recap the original book by Patterson et al. I skimmed the book and I wasn't impressed. I was, however, pleased with the VitalSmarts "Participant Toolkit", with their educational materials, and with our instructor.

I'm also not going to recapitulate the course. This is a summary of my personal interpretation and transformation of the course material including my own experience and readings. I particularly recommend the complementary book Bedell's Three Steps to Yes. In some respects this post may contradict the course work, in others it extends the material.

Before I begin, I can't resist some cultural context. It's impossible to read this book, with its model of "silence" and "violence" as two styles of aggressive conversation, without thinking of "female" and "male". Among other things, this material can be read as a guide to communication in a multi-gender corporate hierarchy. There are limits to this interpretation of course. Like many geeks of either gender my "style" score was silence/violence balanced, with a bias to "silence". (Important note: "violence" in this context is verbal - sarcasm (attack), verbal control, and verbal labeling. It's an interesting choice of label.)

The concept of a "crucial" conversation is novel and meaningful. There are three ingredients, but one is particularly critical. The first two ingredients are high stakes and conflict. The third and most critical ingredient is (negative) emotion. The primary focus and value of the "Crucial Conversation" (CC) methodology is managing the emotional component of important conversations. The goal is to transform a high-emotion interaction to a low emotion "dialog".

CC, therefore, is not a good thing (I think the course materials are confused about this). The "good thing" is a productive and positive dialogue. A "CC" is, at best, a means to getting to dialogue. At worst, it's the result of a botched interaction, and a means to get things back on track.

The first goal of the training is to be able to recognize when a Conversation goes Crucial (think Plutonium going Critical). The most important response, at that point, is to give up on the topic of conversation and focus on managing the emotional component. That's a big idea for me. Especially on a phone conference I've planned, I'll miss or disregard the emotional component and try to power through my original agenda.

Speaking of phones, this is where the course is partly obsolete. It was written for the 1990s world of person-to-person communication. In the post-Great Recession world our corporate interactions are by telephone conference. No, not telepresence or videoconference -- 1970s style teleconference. Person-to-person emotion management is hard enough, but after years of being beaten and pummeled I can just about manage it. Managing emotions on a multi-person half-the-team-is-muted conference call takes things to a whole 'nother level.

So how can we modify the approach of the course to a setting where emotions can rise fast on a teleconference with people we may see every few years? The approach I'm going to test goes like this:

  1. Identify signs of emotional intensity. Conspicuous silence or tone of voice are the best remote cues. I can monitor my own responses, such as rapid heart rate, sweating or increasingly slow speech -- but those are late arrivals. I think an early sign of emotional content, for me, is narrowing of my eyes. I'm training myself to detect that and even to forcibly relax my eye muscles.
  2. Manage the immediate emotion. This may mean using techniques that CC considers dysfunctional -- such as avoiding and withdrawing. The goal is to get out of the call without an escalation.
  3. Schedule a managed one-on-one "CC" -- by phone usually (alas). (Telepresence is better, and with advance planning may be available). Scheduling the one-on-one "CC" gives time to work on the "path" script of Fact definition, scripted Story, and Ask questions. The goal of the scheduled "criticality" is to get through emotion and back to "dialogue".

The training and course materials don't discriminate between a "planned" and an "emergent" CC. That's a big distinction. It's the difference between running uphill and running through a minefield. Given where I am now, my personal goals are to recognize an "emergent" CC, calm it as much as possible (abandon agenda, get out of Dodge alive), and then plan a managed CC.

My outline of a managed CC borrows from CC and Bedell. It starts with a planning phase that's largely Bedell ...

  1. What is my goal for me and others? What is the true goal of the other person -- even though they may not know it themselves? (In Bedell's world, it's usually personal success in one form or another. That seems to work.)
  2. What is it I want to avoid?
  3. How can I achieve #1 and avoid #2?

Knowing the other person's true goal, and how that can be achieved, is the key to both Bedell's "Persuasion" and to creating CC's "shared purpose". That "shared purpose" may be to achieve success for the other person, even if their original conversation goal is not met. In Bedell's terms, find a way for my "Prospect" to "Win" -- while making the sale.

CC next focuses on the "do/don't" statement as a way to express my conversational goals. "I do want to get paid, I don't expect to get paid this week." It's not a bad place to start, but I can see how it might need modification.

The next phase in this structured high-emotion conversation is Fact/Story -- avoiding the dreaded "why" (the other banned word is "but"). Fact is supposed to be an enumeration of verifiable statements that will be considered "true" by all participants. The Story comes next -- it's where the emotion and opinions come in. The Story is the statement of personal impressions, carefully refactored to avoid "violence" (sarcasm, control, labeling, etc), to avoid identifying a villain or a victim, and to avoid expressing helplessness.

For me, both the Facts and "the Story" are best written out beforehand and practiced aloud.

The Facts and Story are to be presented in a "tentative" and "testing" fashion (What have I left out? Does this sound right? What are your thoughts/feelings?).

The Story is followed by the "Ask". The goal here is to encourage the "Prospect" (Bedell's term) to follow a similar "Path" by asking framing questions and using classic conversational strategies such as mirroring (I hear you say you're good, but as I imagine your face I think it's .... ok, so this doesn't work so well over the phone) and "paraphrasing".

At that point, if all (miraculously) has gone relatively well, the "Crucial Conversation" is done, and the action conversation (decisions, dialog, discussion) begins.

Or so the theory has it.

I'll be testing that out.

Friday, May 13, 2011

Separated at birth: alternative medicine and climate change denial

As a colleague and I corresponded about my support for the scientific consensus on CO2 driven climate change, I realized I was replaying fifteen year old conversations about the alternatives to science-based medicine. Homeopathy consumers have a lot in common with cosmic ray climate enthusiasts.

One common thread is a skepticism about the value of science, and particularly the value of the scientific establishment.

Some believe that science simply doesn't apply. "Healing fields", they say, cannot be detected by science; indeed scientific analysis may destroy them. Herbal remedies are safe because Nature loves us. Yahweh promised us the Earth, so it's impossible for us to render it (transiently) inhospitable.

This version of anti-science is uninteresting. These arguments can't be refuted for the same reasons that we can't disprove the existence of unicorns and leprechauns.  There's no measure for resolving disagreements; these are theological disputes.

Another form of argument grants that the scientific method is effective, but claims that the scientific establishment is corrupt and untrustworthy. This is more interesting because it's at least partly true. Over the past twenty years we've learned about the effects of publication bias, particularly when corporations with strong financial interests (ex: Pharma) control the publication of research results. We've seen some spectacular scientific frauds, and we've seen a trend to "me too" research that gets safe grants but produces small results. During the Bush years, we saw loyalists suppress scientific results their bosses disliked.

Alas, there's no evidence the amateurs are reliable; most seem driven by the same passions that power crank cosmologists. Even if they were angels, furthermore, by their nature these amateurs bypass scientific evaluation and challenge. They cannot be judged because they're not in the game.

Sure, the scientific program is imperfect, but, when it comes to understanding the world, there are no alternatives. The process of iterating on internally consistent models that make testable predictions, and revising those models when predictions fail, has transformed human history. It is the only guide we have to developing better medicines, understanding the universe, or predicting the consequences of CO2 accumulation.

The denialists do have a point, even if they don't fully recognize it. We can and should improve the machinery of science. Requirements to publish data obtained through public investments, registries of studies to ensure negative and unfavorable results are published,  and (more challenging) reforms to grant programs and academic tenure are some of the improvements seen over the past decade.

Science tells us Homeopathy's effects are mediated by belief, not molecules. Science tells us that CO2 accumulation will change the earth's climate; and that these changes will be extremely disruptive for a crowded planet with fixed borders.

Maybe in ten years science will tell us that solar cycles are more important for our 21st century climate than CO2 accumulation. Maybe science will tell us that spinal manipulations do change the immune system. Maybe, but probably not.

Update 5/14/11: I've rewritten parts of the first few paragraphs.

Saturday, January 15, 2011

What will be unacceptable in 50 years?

Ta-Nehisi Coates has been trying to imagine how, about 150 years ago, an American culture could celebrate the ownership of humans. That is unacceptable in America today. Times change.

Assuming we continued the historic trends of the past hundred years [1], what will be unacceptable fifty years from now?

I listed three ideas in a comment ...

Grappling With Genosha - Ta-Nehisi Coates - Personal - The Atlantic

.... What things do we love that will be despised in fifty years?

I eat pigs. I think that will be unacceptable in fifty years.

Coates loves football, a sport where brain damage is a normal outcome. I think that will be unacceptable sooner than fifty years from now.

We imprison and kill people with disabled brains. I think that will be unacceptable fifty years from now...

Any others?

[1] Of course this is simply a thought experiment. If the trends of the past century continue, humans will not be defining acceptability.

Sunday, January 09, 2011

Most cultures punish the atypically generous

All cultures punish "cheaters". That's unsurprising.

What's surprising is that most cultures punish the atypically generous ...

Balkinization

... the U.S, Australia and the U.K. subjects were much less likely to punish players who HAD cooperated by contributing to the group project. In other societies, ‘[m]any subjects engaged in anti-social punishment; that is, they paid to reduce the earnings of ‘overly’ cooperative individuals (those who contributed more than the punisher did).’ ...

Looking at the graph, some cultures punish atypical generosity even more than "free riders". On visual inspection I see three grades of anti-social punishment ...

  • Low levels: US, Australia, UK, Switzerland, China, Germany
  • Mid level: Denmark, Ukraine, Korea, Turkey, Russia (I suspect Japan would resemble Korea)
  • High level: Saudi Arabia, Greece, Oman

I suspect Canada would fall between Germany and Denmark, at the high end of low punishment. I grew up in Canada, where we understood it was rude to be exceptionally good in any way. I wonder if what's measured here is really a general response to being exceptional (talented, witty, generous, etc) rather than a specific response to "excessive" generosity. It may also be that in some cultures there is a strong duty to reciprocate generosity (Japan?), so the generous act can be a bit of an unwanted gift.

I think these responses are important to understand -- particularly for those whose programming favors generosity. Even in the US the naturally generous will work with people from cultures that may resent or feel burdened by an unsolicited gift. It is often wise to balance gifts with requests, providing an opportunity for the recipient to balance the scales. Most of all it may be wise to do the deed invisibly, so nobody will feel burdened and the Samaritan will go unpunished.
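The public-goods game with punishment behind these results can be sketched in a few lines. This is an illustrative toy model, not the study's actual code; the multiplier, endowment, and punishment cost/fine are made-up parameters chosen only to show the mechanics.

```python
# Toy public-goods game with punishment (illustrative parameters only).
# Stage 1: each player contributes to a common pool, which is multiplied
# and shared equally. Stage 2: any player may pay to reduce another
# player's earnings -- "anti-social punishment" is punishing someone who
# contributed MORE than the punisher did.

def payoffs(contributions, multiplier=1.6, endowment=20):
    """Stage-1 earnings: keep what you didn't contribute,
    plus an equal share of the multiplied common pool."""
    pool = multiplier * sum(contributions)
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

def punish(earnings, punisher, target, points, cost=1, fine=3):
    """Stage 2: the punisher pays `cost` per point to remove
    `fine` per point from the target's earnings."""
    earnings = list(earnings)
    earnings[punisher] -= cost * points
    earnings[target] -= fine * points
    return earnings

# Three players: a free rider (0), a typical contributor (10),
# and an "overly" generous one (20).
e = payoffs([0, 10, 20])          # pool = 48, share = 16 each
# Anti-social punishment: the typical contributor spends 2 points
# to punish the most generous player.
e = punish(e, punisher=1, target=2, points=2)
```

Note the perverse arithmetic: the generous player ends up with the lowest earnings of the three, which is exactly the pattern the high-punishment societies in the graph produce.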

 

Monday, December 27, 2010

The history of post-neolithic humanity in 10 minutes - DeLong's annual Econ 1 post

Brad DeLong, my favorite economist, has published the latest edition of his annual Econ 1 Berkeley: September 29 2010 Economic Growth Lecture. It's his gift to the rest of us, and a fine gift it is [1]. This is why I love blogs.

In about ten minutes anyone can catch up on the most current synthesis of the past 12,000 years of human history; from the deep history of the Neolithic to modern IT and the rise of India and China. He stops just short of putting IT on the same level as the development of language -- too soon to tell.

That leaves unspoken the period from about 150,000 BCE to 10,000 BCE and especially 30,000 to 12,000 BCE. This is deep history, and 2010 has been a breathtaking exploration of the pre-neolithic. In just the last eight months we've learned we moderns are a mongrel mix of Denisovan, Neandertal and, probably a lot of other pre-neolithic human "breeds". Out of that churning mix came something astonishing, horrifying, and (we currently believe) completely new to the earth - the technocentric animal.

Exciting times.

See also:

Some of my stuff

[1] The next time I'm out SF way, I'm going to see if there's some way to sneak into a DeLong lecture. Maybe he sells tickets?

Thursday, October 21, 2010

The rational basis for climate change denialism

I consider this a respectable and rational basis for denying that the earth's climate is being significantly altered by human greenhouse gas emissions ...
Global Warming Skepticism in Tea Party - NYTimes.com
...A rain of boos showered Mr. Hill, including a hearty growl from Norman Dennison, a 50-year-old electrician and founder of the Corydon Tea Party.
“It’s a flat-out lie,” Mr. Dennison said in an interview after the debate, adding that he had based his view on the preaching of Rush Limbaugh and the teaching of Scripture. “I read my Bible,” Mr. Dennison said. “He made this earth for us to utilize.”...
I like this response. It's much less painful than reading right wing pseudo-science.

Mr. Dennison holds a set of religious beliefs. That belief set includes the understanding that God gave Man a planet to use as Man wishes, and He designed the planet so Man could not damage it. Therefore the scientific consensus on climate change is a fraud.

His reasoning is absolutely internally consistent. His conclusions follow directly from his premises. There is no response save to criticize his religious beliefs -- which is a rather sensitive topic.

I wish more Denialists were as honest as Mr. Dennison. I'm not being sarcastic. I think, at the core, this is what most Denialists believe but refuse to say.

Saturday, October 02, 2010

Why do corporations (firms) exist?

Economists used to wonder, from a theoretical perspective, why "firms" including companies, and especially large corporations, exist (aka theory of the firm). In 1937 Coase thought that while corporations didn't allocate labor and capital as well as the market, this was offset by lower transaction costs.

Of course transactions costs in the net era are far less than in Coase's time, so this doesn't explain why corporations remain so entrenched.

This still seems like a valid question. Does knowledge work, in particular, scale all that well? Movies seem to be put together by loose coalitions of small to medium sized companies, why aren't more things done like that?

I suspect most people familiar with large corporations would agree that often the company seems much less than the sum of its parts. In particular, the absence of internal markets can make intra-company collaboration less efficient than market based collaboration. Corporations, on the inside, operate like the command economies of the Soviet Empire (or, for that matter, like today's China -- which is doing well for the moment).

I'm trying to put together a list of things that large corporations can do uniquely well. I wasn't at all impressed with the conventional "theory of the firm" list. Here's mine ...

  1. Act without the restraints of antitrust law. A large corporation can do many things that would require collusion to be done by smaller entities.
  2. Change laws, particularly accounting standards and tax laws, to favor large corporations and lower their cost of capital. This creates a positive feedback loop where tax laws and accounting rules favor large corporations, which in turn influence laws and rules that favor large corporations and so on.
  3. Corporations can buy senators and lesser politicians, again without collusion.
  4. Corporations can engage in financial warfare, cutting off suppliers to smaller competitors, blocking access to capital, and so on.
  5. Corporations can capture regulators.
  6. Corporations may be able to create and institute processes that allow them to do knowledge work with "average" knowledge workers instead of temperamental and expensive "stars". (I don't think this actually works, but a lot of effort is spent on this.)
  7. Corporations can buy A and above ratings from (corrupt) rating agencies.
  8. Once a corporation exists, it has an unusual ability to sustain itself even when its mission ends (like the Inquisition).

Taking these items as a whole, it's apparent that once corporations are established, they are large and powerful enough to change their ecosystem to suit them. Rather like some primates.

I'll update my list as I get more ideas. Any suggestions?

See also:

My stuff

Other people's

Update 2/25/11: In a Krugman article I learn that Williamson won the Nobel in 2009 for work in the 70s on the theory of the firm. So Williamson extended Coase ...

Williamson argues that the firm is best regarded as a "governance structure," a means of organizing a set of contractual relations among individual agents. The firm, then, consists of an entrepreneur-owner, the tangible assets he owns, and a set of employment relationships ...

Personally I wasn't that impressed with the descriptions I read of Williamson's work, but Krugman likes it (emphases mine)...

Oliver Williamson shared the 2009 Nobel mainly because of his work on a question that may seem obvious, but is much less so once you think about it: why are there so many big companies? Why not just rely on markets to coordinate activity among individuals or small firms? Why, in effect, do we have a lot of fairly large command-and-control economies embedded in our market system?

Williamson answered this in terms of the difficulties of writing complete contracts; when the tasks that need to be done are complex, so that you can’t fully specify what people should do in advance, there can be a lot of slippage and strategic behavior if you rely on market incentives; in such cases it can be better to do these things in-house, so that you can simply tell people to do something a particular way or to change their behavior.

... there are times when it’s better to rely on central planning than to leave things up to the market...

Krugman's "Central planning" comment sent the usual suspects frothing mad. They've obviously never lived in a large corporation. I have. Krugman is spot on.

 

Tuesday, September 28, 2010

The cultural impact of the Pill - neuroendocrinology

Modern imaging methods show hormonal contraceptive use changes brain structures.

That's interesting. It means it's now probably safe to mention one of the most interesting papers I ever wrote. For obvious reasons it was quickly buried.

I was an itinerant Watson Fellow in early 1982, staying with a very generous USAID worker and his wife in Dhaka, Bangladesh. I was basically a parasite, but somehow I got it into my head to write a paper on the sociocultural implications of widespread OCP use in Thailand.

The premise of my paper was simple. Different OCPs, and progesterone implants, were known to have different effects on mood. Testosterone-biased OCPs had one set of effects, estrogen-biased another set, progesterone yet another. It seemed obvious that if you gave these medicines to millions of women the sum of the individual mood changes might have social implications.

If you wanted to change a society in a certain direction, you might favor one OCP over another. I was keen on social engineering in those days. That was before I was drummed out of the Trilateral Commission [1], and before a subsequent social engineering paper almost ended my first year of medical school.

Needless to say, I never got any comments about my pill paper. I was remarkably obtuse at that age, but even I had a sense this was not a wise topic choice. If anyone read my paper, they would have torched it immediately.

I suspect, however, that I was right.

[1] Joke. Sort of.

Wednesday, September 22, 2010

Race and ethnicity: Minneapolis and St. Paul

Race and ethnicity: Minneapolis and St. Paul.

It's part of a Flickr set by Eric Fisher inspired by Bill Rankin's Chicago map. (Via Fast Company).

Where I live is very red dot (white), though my household is 40% sunburn resistant. The Chicago map is much more dramatic and interesting.

Saturday, September 11, 2010

We're crazy now. We were crazier forty years ago.

Limbaugh. Beck. Palin. Bachman. Pawlenty. Mosque madness. Burning Qu'rans. Marketarianism. Denialism. Birther. Truther. American torture.

We're certifiable. It's not just 9/11 -- we elected Cheney and denied reason before that. It took 9/11 though, to really put us in asylum territory.

If you care about humanity, or your own family, it's a wee bit depressing. That's why I liked Graham Burnett's Orion article. It's ostensibly about dolphins, but it tells the story of a peculiar man in a peculiar time not so long ago...
A Mind in the Water | Orion Magazine

... who was Lilly? His early biography offers little hint of what would be his enduring obsession with the bottlenose. Taking a degree in physics from Caltech in 1938, Lilly headed off to study medicine at the University of Pennsylvania, joining the war effort as a researcher in avionics. An early photo shows him as a rakish young scientist, smoking a corncob pipe while tinkering with a device designed to monitor the blood pressure of American flyboys—a number of whom, in those days, were actually using surfacing cetaceans for strafing practice.

After the war, motivated in large part by contact with the pioneering brain surgeon Wilder Penfield, Lilly turned his hand to neuroscience, applying the era’s expanding array of solid-state electronic devices to the monitoring and mapping of the central nervous system. Eventually appointed to a research position at the National Institutes of Mental Health (NIMH), Lilly spent the better part of a decade conducting invasive cortical vivisection on a variety of animals, particularly macaques. In the spy-versus-spy world of the high Cold War, this kind of work had undeniably creepy dimensions. Manchurian Candidate anxieties about “forced indoctrination” and pharmacological manipulation of political loyalties peaked in the 1950s, and security establishment spooks (as well as a few actual thugs) hung around the edges of the laboratories where scientists were hammering electrodes into primate brains...
Caltech alumnus. Medical training in Pennsylvania. Went into the tech industry. That's way too close to my life.

There are other intersections. I loved dolphins as a child; I'm sure I read his 1960 Man and Dolphin -- or at least the derivative works. (I was born in 1959, but in those days books lasted a long time in public libraries.)

Lilly was genuinely crazy, but, as Burnett reveals, so was his time.

This may come as a surprise to some. My generation has been keeping the 1970s in the attic, pretending it never happened. We got rid of all the books and most of the movies (the early music  we kept). We had lots of help -- everyone from that time has something to hide. The 1960s made a good distraction.

It's been forty years though. There are curious adults alive today with nothing to hide. They're going to start poking around the attic.

They'll find that the 1970s were seriously crazy. Yeah, America's nuts now, but, the good news is, we were at least as crazy then.

Sunday, August 22, 2010

The Corporation - what next?

In my seventh year within the fascinating, feudal, emergent machinery of a classic publicly traded corporation I volunteered to write a white paper about supporting internal collaboration for shared services. I wrote a post in 2008 asking about examples of systems to enable such collaboration.

I can't share the final paper here, but the conclusions were unsurprising. I think they are true of any corporation of significant size.

In the absence of internal markets, contracts, and currencies, true corporate collaboration requires either accounting system reorganization, or serious executive pressure, or some sort of babysitting co-op style internal currency. All of these things are very hard to do; ironically collaboration outside the corporation is easier (see also - outsourcing) [3]. For example, executive power, like Presidential power, is a limited currency that must be used sparingly. [1]

I felt when I wrote the paper that I was walking old ground, but my real expertise is in health care and more esoteric domains. I didn't know how to follow this trail.

Later I learned I was intersecting the path of Ronald Coase and his 1937 paper The Nature of the Firm [3]. Alan Murray describes the paper in a recent WSJ article on the future of the corporation ...
The End of Management - Alan Murray - WSJ.com
... British economist Ronald Coase laid out the basic logic of the managed corporation in his 1937 work, "The Nature of the Firm." He argued corporations were necessary because of what he called "transaction costs." It was simply too complicated and too costly to search for and find the right worker at the right moment for any given task, or to search for supplies, or to renegotiate prices, police performance and protect trade secrets in an open marketplace. The corporation might not be as good at allocating labor and capital as the marketplace; it made up for those weaknesses by reducing transaction costs.
Mr. Coase received his Nobel Prize in 1991—the very dawn of the Internet age. Since then, the ability of human beings on different continents and with vastly different skills and interests to work together and coordinate complex tasks has taken quantum leaps. Complicated enterprises, like maintaining Wikipedia or building a Linux operating system, now can be accomplished with little or no corporate management structure at all...
I wasn't quite following Ronald Coase however; I was intersecting him. Seventy years after his paper was published, I came from a world where intra-corporate transaction costs were higher than extra-corporate costs. By 2007 collaboration within a typical large corporation had become more difficult than similar collaboration with an external agent.

Why did this happen? That's a rather interesting question. I expect there are publications on it, but they'd be hard for me to find. I wouldn't be surprised if the costs of intra-corporate transactions were higher in 2007 than in 1937, and the costs of extra-corporate transactions substantially lower. The balance has shifted.

So why does the Corporation persist?

Well, for one thing, entrenched institutions are like cities of the northeast or trees tangled in the tropical canopy. They don't collapse overnight just because their sustaining systems are gone. The publicly traded corporation is deeply embedded in American law (including taxation law), accounting standards, international treaty, and politics (senatorial ownership). It's going to be around for decades to come.

Beyond mere inertia, however, Corporations are awfully good at economic warfare; a mode of operation more in the province of Machiavelli's The Prince than standard economics texts [4]. Microsoft was once the master of this economic warfare, Intel still is. This mode of operation actually destroys customer value, but it's not going away.

Even though the 20th century Corporation will persist, for better and for worse, it's clear we're in one of those times of cranky dissatisfaction where the ancient Monster of the Market is looking vulnerable. We can at least hope there will be alternatives.

Murray's sources don't know what those alternatives will be, and they seem reluctant to speculate. His prescriptions are a rehash of the usual management book pap - "agile, flexible, ruthless, cut their losses, lots of bets, Google [5], inspire, entrepreneurs, push decision-making down, wisdom of crowds, feedback, change, innovation, adaptability", blah-blah-blah.

For my part I've been looking for good speculation since 2006, and I haven't come across much. I wrote up some of my thoughts last June, and some speculations by Iain Banks yesterday. I'm behind on reading Clay Shirky's 2008 book; I suspect I'll have some follow-up posts when I do.

I'm guessing that we'll see some interesting variations emerge over the next decade. Some of them will resemble Apple (Singaporean model of the brilliant tyrant in what's effectively a public-private corporation), some Google (natural selection - creates sharks and tapeworms), and some may come from China (what is Foxconn [6]?).

I'm hopeful that within a decade I'll be able to invest in privately held companies where the owners have major organ systems in the game. I wouldn't mind working for or owning a part of one of those companies.

Interesting times.

See also

Gordon's Notes
Others
Footnotes
[1] I concluded that any collaboration must be informal. This can work because many employees will, for a minuscule amount of recognition and commendation, happily share their work. (Unless their private knowledge becomes job security - which is a bit of a big caveat.)
So the question becomes how best to enable informal networks of internal collaboration at a time when personal connections within corporations have greatly weakened.
If I were (heaven forfend) running a publicly traded company I would require my IT department to choose a network sharing environment that supported search and discovery, and I'd train people to use it. I think you could actually do this on a large scale using an improved version of Microsoft SharePoint wiki (the rest of SP is an unredeemable disaster) and its companion search and discovery services.
This is hard stuff to do in hard times of course, and in easy times it does not seem necessary.
[3] Yes, all these Wikipedia links do have special meaning in this context.
[4] Christensen's original Innovator's Dilemma is one of the very few business books worth reading; his follow-up books are not nearly as good. Machiavelli's The Prince is still the champion though.
[5] An unfortunate example considering the tragic mess they've made of so many of their projects. Apple is conspicuously absent from the list of examples. As always, omissions are interesting.
[6] And why is its English Wikipedia entry so very brief?

Update 8/22/10: I rewrote my original post after I'd thought about it for a while.

Saturday, August 21, 2010

How we know humans have not yet adapted to the digital world

I still often read that "young people" are naturally "digital".

Writing as a definitely-not-young person I don't see it. I've never seen it.

I did see, before phones had qwerty keys, that teens were very good at texting using numeric keypads. In the later tactile qwerty era they were very good at typing with two thumbs. Young motor cortices and cerebella just pound geezer brains - even if we don't mention gaming. (In the iPhone era, however, the keyboard advantage seems more modest.)

What I haven't seen are major improvements in the abstract domain of interacting with knowledge, information, and early AI systems. If young people were naturally digital then sheer demand would force Gmail to let us edit subject lines. Hasn't happened.

They are certainly not better programmers than the geezers I know. Coding used to be something of a young man's game, but now it takes so long to learn that it's becoming almost a middle-aged game.

Our brains evolve much more quickly than we used to think, but they don't switch tracks in a generation. Computer adaptation is cultural, and we're still in the early stages. We have a lot of cultural evolution to go through before we get maximal benefit from the IT infrastructure of even 2010 -- much less the infrastructure of 2020.

Which means that we'll be in a whitewater world of technology transition for decades to come.

Wednesday, August 18, 2010

Iraq, Afghanistan, Taliban and Patriarchy

Family and friends of a young Afghan couple stoned them to death. Did the girl's father throw a stone? Will his sleep be forever tortured?

Humans are revolting. We should really start over with dogs.

Pending a better sentient animal, however, while stonings, whippings, and mutilations of women are momentarily in the public eye, it is worth resurrecting the old question ...
What fraction of the Talib/Afghan/Saudi/Wahabi cross-gender resistance to the American agenda comes from a desire to maintain extreme patriarchy and male power?
There are, of course, many reasons for anyone to resist the (fungible) American agenda, and many reasons to question the value and expression of that agenda. Those are good discussions, but they don't change my question. (Note that the resistance I'm referring to is cross-gender; women in these cultures are often strident supporters of their current cultural milieu.)

My guess is that maintenance of patriarchy is the strongest single reason for resisting American interventions including education and health care. If we could quantify contributing factors, I bet 2/3 would fall under patriarchy.

I wonder if there's any way to objectively measure this. It might be important to understand, as we enter Year 10 of the Long War.

Sunday, July 04, 2010

Longevity - Homo neandertalis and technicalis

In a world where climate oscillated violently*, and humans wrestled with large animals, was male longevity not an evolutionary priority?

BBC - Radio 4 - Melvyn Bragg - In Our Time - The Neanderthals
... If you were a 30 year old Neanderthal, you were a very old man indeed...

Today, in terra technicalis, at least one population of Homo technicalis has a biological life expectancy of 88. Given what we know about how quickly humans evolve, has our aging rate changed?

Today a 30 yo male is not remarkably less fit than a 20 yo male; marathons are often won by "older" men. Some loss of strength and healing speed is more than offset by experience. So why did earlier humans die so young?

Maybe we do age more slowly, but if we assume a 20% annual mortality rate among active hunters, only (0.8^20) × 100%, or about 1.2%, will live 20 years. So maybe they just led very dangerous lives ...
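The back-of-envelope arithmetic above can be checked with a short sketch (the 20% annual mortality rate and the 20-year span are the post's own assumptions, not data):

```python
# Fraction of a cohort surviving n years given a constant annual mortality rate.
def survival_fraction(annual_mortality: float, years: int) -> float:
    return (1.0 - annual_mortality) ** years

# With the assumed 20% annual mortality, about 1.2% of hunters survive 20 years.
frac = survival_fraction(0.20, 20)
print(f"{frac:.1%}")  # → 1.2%
```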

* What we have thought of as the "ice ages" might be better thought of as the "age of the chaos climate".

Friday, July 02, 2010

How quickly can humans evolve?

Even fifteen years ago cognitive science courses taught that the human mind was frozen in the Pleistocene. Humans didn't evolve any more.

Now we wonder how fast can humans evolve ...

Scientists Cite Fastest Case of Human Evolution – Nicholas Wade - NYTimes.com

…. Comparing the genomes of Tibetans and Han Chinese, the majority ethnic group in China, the biologists found that at least 30 genes had undergone evolutionary change in the Tibetans as they adapted to life on the high plateau. Tibetans and Han Chinese split apart as recently as 3,000 years ago, say the biologists, a group at the Beijing Genomics Institute led by Xin Yi and Jian Wang. The report appears in Friday’s issue of Science.

If confirmed, this would be the most recent known example of human evolutionary change. Until now, the most recent such change was the spread of lactose tolerance — the ability to digest milk in adulthood — among northern Europeans about 7,500 years ago. But archaeologists say that the Tibetan plateau was inhabited much earlier than 3,000 years ago and that the geneticists’ date is incorrect.

When lowlanders try to live at high altitudes, their blood thickens as the body tries to counteract the low oxygen levels by churning out more red blood cells. This overproduction of red blood cells leads to chronic mountain sickness and to lesser fertility — Han Chinese living in Tibet have three times the infant mortality of Tibetans…

This is vicious selection; in pre-technological times the infant mortality gap was probably even greater.

Which reminds me of something I wrote two months ago

… Even after the development of agriculture and writing we see thousand year intervals of relative stasis in China, Egypt and Mesopotamia. How could this be when our fundamental technologies change in decades? Are the minds of modern Egyptians radically different from the minds of only 6,000 years ago? Why? Why do we see this graph at this time in human history?

What did humans do in Georgian caves for 30,000 years? Thirty thousand years of weaving and sewing and nothing changes?! They could not have had the same brains we have. They seem more … Neandertal…

Six thousand years is twice the time it took humans to adapt to the Tibetan plateaus. So that’s plenty of time for brains to change.

Except brains are qualitatively different from red cells. Brains are a platform for minds. Left handed people flip hemispheric specializations, and yet seem to think very much like right handed people.

Think about that. Mutations that flip cardiac orientation are 100% lethal. Flipping hemispheres though – the mind adapts. People born with half a brain can function in human society. Five percent of the population have big ugly looking mutations in brain development systems – yet they seem fine.

The human mind can run similarly on a diverse infrastructure. The software analogy is irresistible. A browser running on an iPhone can look and act a lot like one running on a Win 7 box – but the two systems are very different.

This gives a lot more leeway to evolution. It means that the ‘variation controls’ on the genetic programs for neural development can be “set” (by evolution) to “high variation” – and we can still turn out functioning human minds. It means that brains may be evolving very quickly – over the course of thousands of years.

It will be interesting to compare the DNA of Homo sapiens 2000 BCE with Homo sapiens 2010 CE.

See also

Update 7/20/2010: John Hawks reviews the evidence for active selection. I think when he talks about "demographics" he might be talking about how the unification of dispersed human populations causes new phenotypes to emerge -- but he's tip-toeing around something and is being cryptic.

Sunday, June 13, 2010

Strongest sign of economic recovery

The mass flow "investor" is buying gold ... Worried About Their Dollars, More Are Turning to Gold - NYTimes.com.

Forget every other indicator. This one rules. The recovery is on.

Why dog haters should love prosecution of animal cruelty

Unsurprisingly, people who abuse animals are also dangerous to humans ...
The Animal-Cruelty Syndrome - NYTimes.com:
... significant reason for the increased attention to animal cruelty is a mounting body of evidence about the link between such acts and serious crimes of more narrowly human concern, including illegal firearms possession, drug trafficking, gambling, spousal and child abuse, rape and homicide...
... In his famous series of 1751 engravings, “The Four Stages of Cruelty,” William Hogarth traced the life path of the fictional Tom Nero: Stage 1 depicts Tom as a boy, torturing a dog; Stage 4 shows Tom’s body, fresh from the gallows where he was hanged for murder, being dissected in an anatomical theater. And animal cruelty has long been recognized as a signature pathology of the most serious violent offenders. As a boy, Jeffrey Dahmer impaled the heads of cats and dogs on sticks; Theodore Bundy, implicated in the murders of some three dozen people, told of watching his grandfather torture animals; David Berkowitz, the “Son of Sam,” poisoned his mother’s parakeet....
I am surprised this hasn't gotten more formal research attention in the past. Perhaps scholars assumed it was self-evident? Formal investigations are now confirming long held beliefs. That's good research -- not all long held beliefs are empirically supported.

Not everyone loves dogs and cats. Maybe they have something against brood parasites. Even so, these dog-dislikers have good reason to favor aggressive investigation and prosecution of animal cruelty. My dog is their canary.

Thursday, May 20, 2010

Civilization is stronger than we think: Structural deficits and complex adaptive systems

The more humans you know, the harder it is to imagine that civilization can endure. Billions of consumers. Environmental collapse. Climate change. Peak Oil. China's gender wars. The falling cost of havoc. The GOP. Skynet, sooner or later.

It looks hopeless, but on the other hand it's been 58 years since the first fusion weapon was detonated - and we're still here. That's surprising.

It's not just technology that we've survived. It seems impossible that democracies can manage their finances, but they do ...

Adam Smith's Money World - Once is not Enough

... Greece has its debt bail-out, or appears to have, but there’s still that riot-inducing issue of government budget cuts. Is it even feasible for a government to cut its budget by as much as the International Monetary Fund has demanded of Greece? Yes it is very possible -- all too possible, in fact -- according to the IMF’s own study. In the past three decades there have been at least nine instances in which developed nations have cut their structural deficits by at least 10 percent of GDP...

It's true that some nations do better than others, but it's impressive that, faced with doom, even troubled nations like Greece and the US draw back. For example, to our great shame we reelected George W Bush and Richard Cheney. We did not, however, elect John McCain (now sadly demented) and Sarah Palin.

How does reason emerge from chaos?

We don't know, but many suspect it has something to do with the properties of a complex adaptive system. In our case it's a system built of economics and politics and the noise of the disconnected and, perhaps, the cumulative influence of the rational individual. It's a system that is self-sustaining, a system that "wants to live".

The system is hard to measure, but it's strong. It's also a fractal response -- just as civilization is surprisingly robust, so too are its components. Consider the digital economy. Perfect, near zero cost replication was very disruptive -- but the system is responding. The iPad, the Flash wars, Google TV, "curated computing" -- it's all about the system responding to the disruption. It's all about Digital Rights Management (DRM). Of which I will say more ...


Monday, May 17, 2010

Krugman discovers humans are not rational

Paul Krugman is a fan of behavioral economics. He’s also fabulously well read, he must have read some anthropology, history, and political science at some point in his life. At heart though, Krugman is an economist. It’s hard for an economist to escape the prejudice that humans are fundamentally rational self-interest optimizers. It’s baked into their culture.

Alas, humans are only partly rational part of the time*. Obama, like every politician, knows this in a deep way. That’s why he ignores Krugman’s political advice.

Krugman can learn though. I’ve read him religiously since he became a byte-stained wretch, and he’s changing. He’s learning politics (emphases mine) …

Krugman - The G.O.P. - Going to Extreme - NYTimes.com

… Right-wing extremism may be the same as it ever was, but it clearly has more adherents now than it did a couple of years ago. Why? It may have a lot to do with a troubled economy.

True, that’s not how it was supposed to work. When the economy plunged into crisis, many observers — myself included — expected a political shift to the left. After all, the crisis made nonsense of the right’s markets-know-best, regulation-is-always-bad dogma. In retrospect, however, this was naïve: voters tend to react with their guts, not in response to analytical arguments — and in bad times, the gut reaction of many voters is to move right.

That’s the message of a recent paper by the economists Markus Brückner and Hans Peter Grüner, who find a striking correlation between economic performance and political extremism in advanced nations: in both America and Europe, periods of low economic growth tend to be associated with a rising vote for right-wing and nationalist political parties. The rise of the Tea Party, in other words, was exactly what we should have expected in the wake of the economic crisis…

Better late than never. The new Krugman will be even more interesting than the old one was.

* I suspect on average, over time, the system in which we are embedded is more rational than it seems, but that’s another post. (Yes, sounds like “psychohistory”, and, yes, Krugman, like me, grew up on Asimov.)

Friday, May 14, 2010

Identity: Legion is a character defect?

Last February I wrote The Buzz profile problem: I am Legion.

It surprised me that I had to write the post. I thought it was self-evident that adults have many identities. Google's Buzz flop made me realize I was wrong. Obviously a lot of Googlers missed the obvious.

Google may be catching on. Not so Facebook's master - Mark Zuckerberg ...
An Internet Where Everyone Knows You’re a Dog — Crooked Timber

...While searching for evidence of Zuckerberg’s broader philosophy of information, a passage from David Kirkpatrick’s forthcoming book, The Facebook Effect, is quoted:
“You have one identity,” he emphasized three times in a single interview with David Kirkpatrick in his book, “The Facebook Effect.” “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly.” He adds: “Having two identities for yourself is an example of a lack of integrity.”

Zuckerberg is famously young, and famously wealthy. He has not had to grow up; he may never have to grow up.

Adults have complicated lives. Adults have parents, and children and grandchildren, patients and students, employers and colleagues and staff, friends and neighbors. Adults live in a crowded world where wisdom and compassion mean muting the self, juggling the complexity of contextual identity. What we used to call, in medical school, being professional.

Zuckerberg is not an adult. I know where he's coming from. As an aspergerish teenager I might have made the same mistake. He'll likely grow up one day and realize he goofed.

Problem is, we can't wait. He's rich enough that growing up may take a very long time, and for that time he'll be running Facebook.

I'm winding down my Facebook presence; I'll let it die a natural death. If Google or someone else provides a smarter alternative, I'll encourage friends and family to switch.