Friday, May 23, 2008

Why is corporate IT so bad? Because CEOs don't like IT.

Much of my life is spent in the world of the large publicly traded corporation.

It's a curious world. I never aimed to be here, but my life is much more like a ship in a storm than an eagle on the wind. I washed ashore and have lived among these peculiar natives for many years. I have learned some of their mysterious rituals and customs, and I seem to them more odd than alien.

There are many things I could say about large corporations, which I think of as a mix between the worlds of European feudalism, the command economies of the Soviet empire, and the combative tribal cultures of New Guinea. From yet another perspective the modern corporation is an amoeba oozing across an emergent plain of virtual life, a world in which humans do not exist and multi-cellular organisms are still in the future.

But I digress.

One of the peculiarities of modern corporate life is how awful the essential IT infrastructure usually is (70% plus in Cringely's unscientific polling). Electricity, phones and heat aren't too bad, but corporate IT systems are a mess.

Broadly speaking, corporate IT infrastructure feels about 30-40% under-funded, in part due to an inevitable dependency on the Microsoft platform with its very high cost of ownership. Even if IT infrastructures were fully funded, however, there would remain a near-universal failure to measure the impact of various solutions on employee productivity.

Why is this?

Cringely tries to answer this question. I think he's close to the right track, but he's distracted by focusing on management expertise and other peripheral issues. I think the answer lies on a related dimension. First, Cringely ...

I, Cringely . The Pulpit . IT Wars | PBS

Last week's column on Gartner Inc. and the thin underbelly of IT was a hit, it seems, with very few readers rising to the defense of Gartner or the IT power structure in general... the bigger question is why IT even has to work this way at all?

... Whether IT managers are promoted from within or brought from outside it is clear that they usually aren't hired for their technical prowess, but rather for their ability to get along with THEIR bosses, who are almost inevitably not technical...

... The typical power structure of corporate (which includes government) IT tends to discourage efficiency while encouraging factionalization. Except in the rare instance where the IT director rises from the ranks of super-users, there is a prideful disconnect between the IT culture and the user culture...

...In time this will end through the expedient of a generational change. Old IT and old users will go away to be replaced by new IT and new users, each coming from a new place...

It's kind of a chaotic column really (perhaps because it was written on an iPhone!); the excerpts above show just the bits I thought were interesting. From them you can see that Cringely is, in part, taking a sociological perspective. I think that's the right approach, one that considers the age of today's senior executives and the world they grew up in.

In essence, the senior executives of most corporations are not dependent on IT in any significant way, and they tend to have a substantial (often justified) emotional distrust of computer technology in general. It is, to them, an alien and unpleasant world they'd rather forget about. They don't use the IT systems that drive their employees to drink, and they quickly forget about them.

For this group corporate IT infrastructure is a mysterious expense, with unclear returns.

It is not surprising that the IT world, then, is the problem child in the attic. It will take a generational change to fix this, so we'll be living with the problem for another twenty years...

Finding blogs - the cult of In Our Time

I really don't know that many In Our Time fans.

That puzzles me. It seems everyone ought to be listening to Lord Melvyn Bragg and company on their daily commutes. I've sent out a few starter DVDs, but I don't believe I've created any compulsive listeners.

It must be a rare mutation.

On the other hand, it's a big world. There may be hundreds, nay, thousands of cultists.

Once we would have had to rely upon a secret handshake, or a tie worn a certain way, but now there are other ways for cultists to find one another.

We can search - "In our Time" bragg - Google Blog Search.

That's quite a good list of blogs for me to explore. Now if those fellow fans would like to join one of those newfangled social networking thingies ...

IOT does The Black Death

This one will be available as a podcast for 3-4 more days: BBC - Radio 4 In Our Time - The Black Death. It should be superb; it plays to Melvyn's strengths. Get it now before it goes streaming.

I'm a fan, of course.

Thursday, May 22, 2008

Irena Sendler: Read this.

The obituary of Irena Sendler, dead at 96 years old.
Irena Sendler | Economist.com:

... That bureaucratic loophole allowed her to save more Jews than the far better known Oscar Schindler. It was astonishingly risky. Some children could be smuggled out in lorries, or in trams supposedly returning empty to the depot. More often they went by secret passageways from buildings on the outskirts of the ghetto. To save one Jew, she reckoned, required 12 outsiders working in total secrecy: drivers for the vehicles; priests to issue false baptism certificates; bureaucrats to provide ration cards; and most of all, families or religious orders to care for them. The penalty for helping Jews was instant execution.

To make matters even riskier, Mrs Sendler insisted on recording the children's details to help them trace their families later. These were written on pieces of tissue paper bundled on her bedside table; the plan was to hurl them out of the window if the Gestapo called. The Nazis did catch her (thinking she was a small cog, not the linchpin of the rescue scheme) but did not find the files, secreted in a friend's armpit. Under torture she revealed nothing. Thanks to a well-placed bribe, she escaped execution; the children's files were buried in glass jars. Mrs Sendler spent the rest of the war under an assumed name...

AT&T - Saint Paul is NOT a part of Minneapolis!

The good news is that we live deep in AT&T 3G network coverage. This will be important after iPhone 2.0 comes out on June 9th.

The bad news is that AT&T's MN coverage listing includes Minneapolis, but not Saint Paul.

Apparently, they think Minneapolis includes Saint Paul.

There is no greater crime in these parts than to think Minneapolis is the whole of the Twin (as in two) Cities. This is worse than treating the Bronx as part of Manhattan, or conflating San Francisco and San Jose.

Someone needs to write AT&T a letter!

General Sanchez: Abu Ghraib was made in the White House

Lt. Gen. Ricardo Sanchez was disgraced by the black hole of Abu Ghraib. He commanded in Iraq at that time.

In a recent book, he tells us Abu Ghraib was born in the White House:
Torture Trail - Intel Dump - Phillip Carter on national security and the military.:

... Because of the U.S. military orders and presidential guidance in January and February 2002, respectively, there were no longer any constraints regarding techniques used to induce intelligence out of prisoners, nor was there any supervisory oversight. In essence, guidelines stipulated by the Geneva Conventions had been set aside in Afghanistan -- and the broader war on terror. The Bush administration did not clearly understand the profound implications of its policy on the U.S. armed forces.

In essence, the administration had eliminated the entire doctrinal, training, and procedural foundations that existed for the conduct of interrogations. It was now left to individual interrogators to make the crucial decisions of what techniques could be utilized. Therefore, the articles of the Geneva Conventions were the only laws holding in check the open universe of harsh interrogation techniques. In retrospect, the Bush administration's new policy triggered a sequence of events that led to the use of harsh interrogation tactics not only against al-Qaeda prisoners, but also eventually prisoners in Iraq -- despite our best efforts to restrain such unlawful conduct...
Tired of thinking about American torture? Get used to it. Historians will be talking about this for the next fifty years. Your children and grandchildren will read about it in school.

Wednesday, May 21, 2008

You're not really forgetful. You're just more aware ...

Yes, and you're getting handsomer too.

Can I interest you in some Florida real estate?
Memory Loss - Aging - Alzheimer's Disease - Aging Brains Take In More Information, Studies Show - Health - New York Times

When older people can no longer remember names at a cocktail party, they tend to think that their brainpower is declining. But a growing number of studies suggest that this assumption is often wrong.

Instead, the research finds, the aging brain is simply taking in more data and trying to sift through a clutter of information, often to its long-term benefit...
I confess, I made a rude noise when I read this one. I'm just glad I wasn't drinking at the time -- could have been hard on the ol' laptop.

There ain't no way my brain is improving with age!

It's a nice dream though. There are worse things than denial ... :-).

Wretched success: How IE 4 killed Microsoft's control of the net

It was a strategy that worked wonderfully -- for a while. Really, it ought to have worked forever.

When Microsoft killed Netscape with IE 4 (3?), they used every trick in the old playbook. In particular, they created a set of proprietary extensions to web standards, then baked them into IE server and web application toolkits.

Soon intranet applications were IE only. Many public web sites were also IE only of course, but in the corporate world penetration was 100%.

Why use one browser at work and another at home?

IE took over, Netscape died.

Then history took a strange turn. Google and Yahoo rose just as Phoenix/Firebird/Firefox was struggling to be born. Apple, implausibly, reappeared with a version of IE that wasn't quite the same as the XP version (Safari came later). Microsoft had serious competitors who were motivated to support an alternative to IE. It became possible to get public work done using Firefox. Security vulnerabilities in IE 5 made it a poor choice on the public net. A critical mass of geeks began using Firefox at home, though they still had to use IE at work.

IE 6 came out and corporate apps mostly worked with some tweaks. The browser security issues remained, however. IE 6 was still significantly inferior to Firefox and it continued to lose market share.

Microsoft felt obligated to introduce Internet Explorer 7 -- a quite fine browser that, for reasons that Microsoft may now deeply regret, had to be significantly different from IE 4, 5 and 6. In particular, it had to be more secure and to fully support Google's web apps.

These differences mean that IE 7, years after its release, is still not accepted on many corporate networks. There are many legacy intranet 'web apps' (IE 5 apps, really) that still don't work with it.

Microsoft has become trapped by its corporate installed base, and by the peculiar extensions they created to destroy Netscape.

That's wretched success.

IE 8 is supposed to be two browsers in one -- a "standards" browser and a legacy browser. Clearly Microsoft learned a lesson from IE 7.

Maybe IE 8 will work, and Microsoft will regain its monopoly power. They're certainly going to try with .NET and Silverlight to bind the browser back to the Microsoft ecosystem. At this critical moment in time, however, a very successful strategy has had an unanticipated cost.

Fermi's paradox is in the air

I've been a Fermi Paradox fanboy since a June 2000 Scientific American article roused my ire.

It's fun.

The essence of the puzzle is that while the galaxy is big, exponential growth and galactic time scales mean that critters like us ought to have filled it up by now.
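The "big but not big enough" point is easy to show with back-of-the-envelope arithmetic. In this Python sketch the expansion speed, galactic diameter, and galactic age are my own rough, illustrative assumptions, not figures from any paper: even at a plodding 1% of light speed, a colonizing species sweeps the entire galaxy in about a tenth of a percent of the galaxy's age.

```python
# Back-of-the-envelope Fermi arithmetic. All numbers are rough,
# illustrative assumptions.
galaxy_diameter_ly = 1e5   # Milky Way diameter, ~100,000 light-years
expansion_speed_c = 0.01   # assume colonization spreads at 1% of light speed
galaxy_age_yr = 1e10       # the galaxy is roughly 10 billion years old

# Time to sweep across the galaxy, in years (light-years / fraction-of-c)
crossing_time_yr = galaxy_diameter_ly / expansion_speed_c

# That crossing time as a fraction of the galaxy's age
fraction_of_age = crossing_time_yr / galaxy_age_yr

print(f"crossing time ~ {crossing_time_yr:.1e} years")
print(f"that's ~ {fraction_of_age:.1e} of the galaxy's age")
```

Ten million years to cross, against ten billion years available: even without exponential growth, the timescales leave no excuse for an empty sky.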

I find it helpful to consider the ubiquity of bacteria ...

Gordon's Notes: Earth: the measure of all things

Bacteria: 10**-5 m
Human: 1 meter
Earth: 10**7 m - "mid" way between the Planck length and the universe.
Sun: 10**9 m
Milky way Galaxy: 10**21 m

So it takes at most 10**12 bacteria to stretch (directly) between any two points on the earth's surface.

Similarly, it takes at most 10**14 earths to connect any two points in our galaxy.

So, within an order of magnitude or two, a bacterium is to the earth as the earth is to the galaxy.
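The ratios are easy to verify with a few lines of Python, using the rounded order-of-magnitude lengths from the list above:

```python
import math

# Order-of-magnitude length scales, in meters (rounded values from the list)
bacterium_m = 1e-5
earth_m = 1e7      # roughly Earth's diameter
galaxy_m = 1e21    # roughly the Milky Way's diameter

# Bacteria needed to span the earth; earths needed to span the galaxy
bacteria_per_earth = earth_m / bacterium_m   # ~10**12
earths_per_galaxy = galaxy_m / earth_m       # ~10**14

assert math.isclose(bacteria_per_earth, 1e12)
assert math.isclose(earths_per_galaxy, 1e14)

# The two ratios differ by a factor of ~100: "an order of magnitude or two"
print(earths_per_galaxy / bacteria_per_earth)
```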

Over a mere 1-2 billion years bacteria have saturated the earth; common species are found everywhere. So how come the galaxy doesn't crawl with exponentially expanding aliens?

There have been lots of great theories, I won't review them here (see my old web page for examples). The most widely held explanation is that there is a Creator/Designer and She Wants Us Alone. This is more or less what you'll hear from most of the world's theists and from the Matrix crowd.

I prefer some other theories, though I do take the 'by design' answer seriously. Recently Charles Stross, who's explored the paradox in many of his science fiction novels and short stories, wrote a particularly strong summary of recent discussions ...

Charlie's Diary: The Fermi Paradox revisited; random dispatches from the front line

The Fermi Paradox [is]...  ...a fascinating philosophical conundrum — and an important one: because it raises questions such as "how common are technological civilizations" and "how long do they survive", and that latter one strikes too close to home for comfort. (Hint: we live in a technological civilization, so its life expectancy is a matter that should be of pressing personal interest to us.)

Anyway, here are a couple of interesting papers on the subject, to whet your appetite for the 21st century rationalist version of those old-time mediaeval arguments about angels, pin-heads, and the fire limit for the dance hall built thereon:

First off the block is Nick Bostrom, with a paper in MIT Technology Review titled Where are they? in which he expounds Robin Hanson's idea of the Great Filter:

The evolutionary path to life-forms capable of space colonization leads through a "Great Filter," which can be thought of as a probability barrier... The Great Filter must therefore be sufficiently powerful--which is to say, passing the critical points must be sufficiently improbable--that even with many billions of rolls of the dice, one ends up with nothing: no aliens, no spacecraft, no signals...
The nature of the Great Filter is somewhat important. If it exists at all, there are two possibilities; it could lie in our past, or in our future. If it's in our past, if it's something like (for example) the evolution of multicellular life — that is, if unicellular organisms are ubiquitous but the leap to multicellularity is vanishingly rare — then we're past it, and it doesn't directly threaten us. But if the Great Filter lies between the development of language and tool using creatures and the development of interstellar communication technology, then conceivably we're charging head-first towards a cliff: we're going to run into it, and then ... we won't be around to worry any more.

But the Great Filter argument isn't the only answer to the Fermi Paradox. More recently, Milan M. Ćirković has written a paper, Against the Empire ... an alternative "successful" model for a posthuman civilization exists in the form of the stable but non-expansive "city-state". Ćirković explores the implications of non-empire advanced civilizations for the Fermi paradox and proposes that such localized civilizations would actually be very difficult to detect with the tools at our disposal, and may be much more likely than aggressively expansionist civilizations.

Finally, for some extra fun, here's John Smart pinning a singularitarian twist on the donkey's tail with his paper Answering the Fermi Paradox: Exploring the Mechanisms of Universal Transcension:

I propose that humanity's descendants will not be colonizing outer space. As a careful look at cosmic history demonstrates, complex systems rapidly transition to inner space, and apparently soon thereafter to universal transcension...

A very nice summary, even if it doesn't add anything novel.

My "SETI Fail" page independently reinvented the singularitarian Great Filter, but I soon learned my thought was far from novel. Among others the ubiquitous Mr. Smart told me he'd come up with this resolution in 1972!

Another explanation, btw, is that established powers, fearing rivals, routinely wipe out any civilization foolish enough to advertise itself. Few find this explanation persuasive, but it's pertinent to my next tangent.

Suppose you are a cautious high-tech entity that has survived the Great Filter in some faraway galaxy. You have lots of power available, but you fear sending a signal a galactic neighbor could intercept. Better, perhaps, to send a generous one-way message to another galaxy. The distances are so vast, and light is so slow, that there's no possibility of unwanted extra-galactic visitors. Communication between galaxies is a message to the far future, and thus "safe".

So I wondered, this morning, how one would send such a signal.

Slashdot | ET Will Phone Home Using Neutrinos, Not Photons:

"Neutrinos are better than photons for communicating across the galaxy."

... That's the conclusion of a group of US astronomers who say that the galaxy is filled with photons that make communications channels noisy whereas neutrino comms would be relatively noise free. Photons are also easily scattered and the centre of the galaxy blocks them entirely. That means any civilisation advanced enough to have started to colonise the galaxy would have to rely on neutrino communications. And the astronomers reckon that the next generation of neutrino detectors should be sensitive enough to pick up ET's chatter...

So now we need only look for extra-galactic neutrino messages ...

Tuesday, May 20, 2008

Whatever happened to quantum dot solar energy technology?

Sometimes, when I search for posts, I run across forgotten stories I was once excited about.

For example, I was once pretty impressed by this 2005 report of high efficiency quantum dot solar energy technology ...

Gordon's Notes: The big event of 2005: nanotech solar energy conversion?

CTV.ca | New plastic can better convert solar energy

TORONTO — Researchers at the University of Toronto have invented an infrared-sensitive material that's five times more efficient at turning the sun's power into electrical energy than current methods...

Sargent and other researchers combined specially-designed minute particles called quantum dots, three to four nanometres across, with a polymer to make a plastic that can detect energy in the infrared....

...Sargent said the new plastic composite is, in layman's terms, a layer of film that "catches'' solar energy. He said the film can be applied to any device, much like paint is coated on a wall...
"We've done it to make a device which actually harnesses the power in the room in the infrared.''
The film can convert up to 30 per cent of the sun's power into usable, electrical energy. Today's best plastic solar cells capture only about six per cent.

...Sargent's work was published in the online edition of Nature Materials on Sunday and will appear in its February issue.

Given today's rising oil prices, I assume my excitement was premature.

So what's happened since 2005?

The Sargent group web site now says:

... This first report did not achieve a high efficiency in the infrared. We are working to realize record photovoltaic efficiencies in the infrared to bring performance to what is needed to become commercially relevant...

In other words, the initial press release was a bit ... misleading.

Sigh.

I have no problem spotting exaggeration in healthcare related articles, but unsurprisingly I don't do quite as well in other areas. I should have looked for more sophisticated secondary discussions rather than working from the CTV article.

DeLong: Bill Moyer interviews Philippe Sands

Today Bloglines tossed up 112 DeLong posts.

It does that sometimes. The Analytic Engine gets sand in the gears then suddenly lurches onward.

I knew DeLong couldn't have been that quiet.

Among the posts is an extended excerpt from an interview Bill Moyers did with Philippe Sands. Mr. Sands is a scholar of modern torture who's studied the impact of British torture on the IRA. He believes that the torture strengthened the IRA, and prolonged the conflict, by increasing support from otherwise ambivalent Irish Catholics. Whatever intelligence was gained was outweighed by the damage to Britain's reputation.

Of course this is a pragmatic argument. It is also simply wrong to cause harm and pain, and while some violence may be the lesser of two wrongs (the invasion of Afghanistan, for example), that has not been true of cruelty. The distinction is perhaps comparable to the difference between killing in self-defense and calculated murder.

Lastly, it is important to again recall that humans do slide down slippery slopes very easily. It is in our nature. We have "commandments" and their like for a reason. Legal cruelty is so very, very dangerous ...

Yes, it is hard to continue to read, and write, about the Bush/Cheney/Rumsfeld/Rice/Feith/GOP torture regime. On the other hand, as civil duties go, this one's relatively easy sledding. So be a citizen and at least scan the interview.

By way of background, here's the book blurb on Amazon

On December 2, 2002 the U.S. Secretary of Defense, Donald Rumsfeld, signed his name at the bottom of a document that listed eighteen techniques of interrogation--techniques that defied international definitions of torture. The Rumsfeld Memo authorized the controversial interrogation practices that later migrated to Guantanamo, Afghanistan, Abu Ghraib and elsewhere, as part of the policy of extraordinary rendition. From a behind-the-scenes vantage point, Phillipe Sands investigates how the Rumsfeld Memo set the stage for a divergence from the Geneva Convention and the Torture Convention and holds the individual gatekeepers in the Bush administration accountable for their failure to safeguard international law.

The Torture Team delves deep into the Bush administration to reveal:

  • How the policy of abuse originated with Donald Rumsfeld, Dick Cheney and George W. Bush, and was promoted by their most senior lawyers
  • Personal accounts, through interview, of those most closely involved in the decisions
  • How the Joint Chiefs and normal military decision-making processes were circumvented
  • How Fox TV’s 24 contributed to torture planning
  • How interrogation techniques were approved for use
  • How the new techniques were used on Mohammed Al Qahtani, alleged to be “the 20th highjacker”
  • How the senior lawyers who crafted the policy of abuse exposed themselves to the risk of war crimes charges

and from the interview (editing, links, paragraph insertions, emphasis mine)...

Grasping Reality with Both Hands: The Semi-Daily Journal Economist Brad DeLong

...BILL MOYERS: You subtitle the book Rumsfeld's Memo and the Betrayal of American Values. Tell me briefly about that memo and why it betrayed American values.

PHILIPPE SANDS: The memo appears to be the very first time that the upper echelons of the military or the administration have abandoned President Lincoln's famous disposition of 1863: the U.S. military doesn't do cruelty.... It's called the U.S. Army Field Manual, and it's the bible for the military. And the military, of course, has fallen into error, and have been previous examples of abuse.... But apparently, what hasn't happened before is the abandonment of the rules against cruelty. And the Geneva Conventions were set aside, as Doug Feith, told me, precisely in order to clear the slate and allow aggressive interrogation... at the insistence of Doug Feith and a small group, including some lawyers. And the memo by Donald Rumsfeld then came in December, 2002, after they had identified Muhammed al-Qahtani. But it was permitted to occupy the space that had been created by clearing away the brush work of the Geneva Conventions. And by removing Geneva, that memo became possible.

Why does it abandon American values? It abandons American values because this military in this country has a very fine tradition, as we've been discussing, of not doing cruelty. It's a proud tradition, and it's a tradition born on issues of principle, but also pragmatism. No country is more exposed internationally than the United States.

I've listened, for example, to Justice Antonin Scalia saying, if the president wants to authorize torture, there's nothing in our constitution which stops it. Now, pause for a moment. That is such a foolish thing to say. If the United States president can do that, then why can't the Iranian president do that, or the British prime minister do that, or the Egyptian president do that? You open the door in that way, to all sorts of abuses, and you expose the American military to real dangers, which is why the backlash began with the U.S. Military.... It slipped into a culture of cruelty. There was a, it was put very pithily for me by a clinical psychologist, Mike Gellers, who is with the Naval Criminal Investigation Service, spending time down at Guantanamo, who described to me how once you open the door to a little bit of cruelty, people will believe that more cruelty is a good thing. And once the dogs are unleashed, it's impossible to put them back on. And that's the basis for the belief amongst a lot of people in the military that the interrogation techniques basically slipped from Guantanamo to Iraq, and to Abu Ghraib. And that's why, that's why the administration has to resist the argument and the claim that this came from the top.... It started with a few bad eggs. The administration has talked about a few bad eggs. I don't think the bad eggs are at the bottom. I think the bad eggs are at the top. And what they did was open a door which allowed the migration of abuse, of cruelty and torture to other parts of the world in ways that I think the United States will be struggling to contain for many years to come.

We have a long road of recovery ahead -- if we take it. Electing John McCain, who's abandoned his former opposition to torture, means we take the slippery road instead.

Monday, May 19, 2008

Quicken, Palm, AOL - once they were good

I can't remember when we first got an Intuit Quicken credit card. It might have been in the 80s, when Intuit mailed us a diskette every month.

I think it was a 3.5" diskette, but I know my first copy of Quicken shipped on a 5.25" floppy.

In those days, except for an unfortunate tendency to corrupt its database, Quicken was a pretty good product - on Windows and Mac alike.

It was never quite as good again. Over the past few years we've weaned ourselves off an increasingly flaky product, even as Quicken lost its transaction network. We're back on spreadsheets now, but we've kept our Quicken VISA card.

Until now.

Intuit has decided to switch banks, and the process is a bleedin' mess. Our VISA number will change (thank heavens I use AMEX for all my net transactions -- they're a class act), and when I went to pay my online bill I came across this message:
If your account was recently converted to a Citi card, you will need to access citicards.com to continue paying bills and viewing statements...

If you were recently converted to another Citi card, access www.citicards.com to register your new account. You will need to re-enroll in Paperless Statements and re-register to make Online Payments.

This website will not be available after June 26th, 2008.

Important Notice: As of May 18th , Paperless Statements will no longer be available. Instead, your statement will be sent to you via first-class mail...
Sounds like a messy divorce.

Palm, Quicken, AOL, Lotus, WordPerfect, Borland, Symantec, Norton, Ashton-Tate. They were all good in their day (pre-internet Mac-only AOL wasn't all bad!)

Those days are gone. Good-bye Quicken.

PS. We're looking for a non-Quicken VISA card. We don't pay interest so we don't care about interest rates. We want service, reliability, security, a high quality web site, and minimal to no yearly fee.

Recommendations anyone?

Update 8/14/08: We ended up getting an REI VISA card through US Bank. The "signature" card has a seemingly good cash-back program, the usual warranty protection (though we much prefer AMEX for that), and very good electronic information transfer and Quicken support. So it's in every way better than our old Quicken Visa. We also like REI and the card gives a larger discount there. They do, however, follow the evil practice of many banks -- the due date moves 2 days forward every month. So it's easy to miss the payment. Scum. AMEX sticks to the same day each month. I love my AMEX Blue Cash Back card.

Scary thought - I actually understand this Udell post

I've been doing this stuff too long. This Udell dialog on sparse database representation of social data actually makes sense to me ...

Semi-structured database records for social tagging « Jon Udell

... But when we stepped back and looked at the semi-structured data problem in a larger context, beyond the WinFS requirements, we saw the need to extend the top-level SQL type system in that way. Not just UDTs, but to have arbitrary extensibility...

JU: This is what the semantic web folks are interested in, right? Having attributes scattered through a sparse matrix?

QC: That’s right. And that leads to another thing which we call column groups, which allow you to clump a few of them together and say, that’s a thing, I’m going to put a moniker on that and treat it as an equivalence class in some dimension...

It's not a new discussion, this problem is as old as dirt. Think Lotus Agenda/Notes, etc. I'm sure there are variations on this theme from the pre-relational 1970s as well, and probably the 1960s.

Even today variations of ancient hierarchical databases (Mumps, Cache, Epic Healthcare, etc) are valued in part because of their approaches to the sparse data/flexible attribute problem. So are attribute-value data stores in relational tables.
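The attribute-value approach is easy to sketch. Here's a minimal entity-attribute-value (EAV) store in Python with SQLite; the table layout and sample data are my own invention, not anything from Udell's dialog:

```python
import sqlite3

# One row per (entity, attribute, value) triple: each entity can carry an
# arbitrary, sparse set of attributes with no schema change needed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE eav (entity TEXT, attribute TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO eav VALUES (?, ?, ?)",
    [
        ("post-1", "title", "Fermi's paradox is in the air"),
        ("post-1", "tag", "astronomy"),
        ("post-2", "title", "Quicken, Palm, AOL"),
        ("post-2", "due-date", "2008-06-26"),  # an attribute post-1 lacks
    ],
)

# Reassemble one entity's sparse record as a plain dict
record = dict(
    conn.execute(
        "SELECT attribute, value FROM eav WHERE entity = ?", ("post-1",)
    )
)
print(record)
```

The cost of this flexibility is the usual one: queries that would be a single column scan in a conventional table become self-joins over the triple store -- part of why the Udell dialog is about pushing sparse records into the database engine itself.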

It's interesting to see how it all connects though ...

Ada Lovelace's paralysis, quirks of In Our Time, and the odd beliefs of UK historians

The UK historians often featured on In Our Time are quite entertaining, but they have two common weaknesses.

One is a quite shaky knowledge of science and medicine. I suspect it's the fashion for certain UK academics to know nothing of the past 100 years of science, but it can be a bit annoying.

The other is a fondness for startling off-the-cuff remarks. For example, during a discussion of the peculiar persistence of the Galenic humoral theory [1] one scholar mentioned that medieval scholars couldn't do arithmetic -- so they weren't able to measure the futility of Humoral therapies. The introduction of Indian Maths enabled calculation, and finally ended one set of quackeries. (Though many others thrive today -- despite numeracy!)

I'm doing some expansion here. The original comment was about four words.

This weakness for cryptic but startling statements is somewhat endearing. Yes, the premise may be debatable, but it's interesting.

Today, I swear, I heard a guest proclaim that Ada Lovelace [1] was completely paralyzed for three years due to measles, but then recovered. This turns out to be an example of both of the IOT historian weaknesses.

Measles can have some extremely nasty neurologic complications, but I don't recall reversible paralysis being among them (nor do my online references [2]). I also think it unlikely that she could have survived very long in the early 19th century with complete paralysis.

The Ada Lovelace paralysis story turns out to be a bit of a mystery to the net. Quick searches found varying mentions of the degree of her paralysis, from legs alone to total paralysis. Wikipedia had the most suggestive explanation ...

Ada Lovelace - Wikipedia, the free encyclopedia

...In June 1829, she was paralyzed after a bout of the "measles". Lady Byron subjected the girl to continuous bed rest for nearly a year, which may have extended her period of disability. By 1831 she was able to walk with crutches...

So she was sick with something (measles, polio?), but her disability may have been the product of her eccentric/mentally ill mother's induced bed rest.

Now here's the interesting bit. You don't have to be a genius to figure out that the original story doesn't make sense. So how does it manage to survive in the minds of preeminent UK historians? Don't they ever get comments from their physician friends after public lectures?

[1] Link is to the archive site. Google sends us to the iPlayer beta site, which doesn't keep these episodes.

[2] Years ago MD Consult was a great source of medical references, but publisher fights tore it apart. Progress is not linear.

Sunday, May 18, 2008

An end to foolish pleas for more US engineers and scientists?

Ok, this should finally do it.

This should stop the nonsensical blubbering about how more Americans need to go into science, and more women should do computer science.

We already know that more Americans study science and engineering than makes economic sense -- probably because of immigration effects.

Now we learn that Japanese students are abandoning science and engineering. Why? More money for less work in other jobs.

Science and engineering are the comparative advantage of the "up and coming" nations like China, India, Korea, Thailand, Russia, etc. The only reason the US and Canada have maintained a strong presence in the sciences for so long is because our universities used to attract a large number of international students -- but Bush et al have crushed much of that appeal.

If we want even more US scientists and engineers, we're going to have to start taxing CEO salaries and use the money to buy engineering graduates new homes. Alternatively, we could elect Barack Obama and restore the attractiveness of American universities to the best international students.