Saturday, October 27, 2007

Huckabee and Gail Collins

The other day I actually tried to defend something Gail Collins of the NYT wrote, even comparing her to Molly Ivins.

Today DeLong persuaded me that I was making a mistake:
Grasping Reality with Both Hands: Brad DeLong's Semi-Daily Journal

...The bottom line is that a woman is dead... because Huckabee went along with crackpot anti-Clinton conspiracy nuts and released someone with a significant history of violence and sexual assault...
Collins completely missed the back story of Huckabee's pardon of a man who went on to murder a woman. It wasn't some act of reasoned compassion; it was a bizarre side effect of the right's insane hatred of Bill Clinton.

That is one heck of a blunder, one that would have passed without notice in the days before blogs.

I'm sorry Molly, I won't do that again.

The Wellstone memorial and the path not taken

The family was wandering the Iron Range, exiled by yet another "school release day", when we came upon the Wellstone memorial.

We hadn't known it existed. The memorial brought back bitter memories.

It's been five years since Minnesota Senator Paul Wellstone's plane crashed as he flew north to attend the funeral of a friend. The 2002 election was only a week away.

Wellstone, his wife, his daughter, several friends and two pilots died in the crash. The pilot and copilot had made several errors; it was later determined that they were not qualified to fly the plane.

In October of 2002 America's future was balanced on the edge of a precipice. For much of 2000 through 2002 the Senate had been evenly balanced, but a rational Republican had defected before the election and the Senate went Democratic.

Wellstone's death pushed America over the edge. Minnesota and national talk radio hosts used an impolitic memorial service and the Rove playbook to bring the GOP's Norm Coleman to power.

Minnesota traded a Senator famed for his integrity for one of the most craven politicians ever to spring from Rove's foundry. The GOP had a one-seat majority in the Senate. From 2002 through 2006 the GOP controlled the Supreme Court, the House, the Senate and the Presidency.

Everything that was to follow sprang from that poisoned moment.

In March 2003 the US led an invasion of Iraq, and by 2004 the name Abu Ghraib had come to symbolize the worst of America. An American era that began with the abolition of slavery and peaked with the 1947 Marshall Plan ended in government-approved torture and ideologically mandated incompetence.

Which is why the Wellstone Memorial is a genuinely historic site. For the moment it's not terribly well marked and is little known outside of the area. On Google's maps it's near here -- give or take a kilometer.

When we arrived, on a cold wet Sunday morning, we were alone. During a visit cut short by our children's limited patience, two other cars arrived. I think the site gets regular visitors.

From The Iron Range
(click on the pictures to see larger versions; you can download a full resolution image from the associated album)

It is a very beautiful site, one of the loveliest memorials I've seen anywhere. Certainly it has the sweetest air and finest scents of any I've known.

The Wellstone family and friends are memorialized by ancient stones ...

In 2006 another historic Senate election turned on the survival of another midwestern Democratic senator, Tim Johnson of South Dakota. This time the balance tipped the other way; Johnson survived his brain hemorrhage and the Democrats took the Senate.

Now we have no choice but to hope that the worst is over. We can't go back to the days before waterboarding became a national sport, but maybe we can start down a new path.

A new American enlightenment. There's not much choice, really.

Friday, October 26, 2007

The nuclear apocalypses that should have happened

I've read of most of these before, but DI has a complete collection of publicly known nuclear attack false alarms like this one:
Damn Interesting » The Apocalypses That Might Have Been

.... Unlike the previous alerts, this event wasn't an error in the early-detection system, this missile was confirmed as real. Fearing the worst, the Russian military prepared to launch a full-scale counterattack against the United States. Planes were readied, and missiles sat waiting to launch a nuclear volley on selected targets in the United States at a moment's notice. Tensions were running so high within the Russian leadership that Russian President Boris Yeltsin activated his nuclear briefcase, enabling him to communicate with his top military advisers and review the situation online. This was the first time he had ever done so. Amidst this uncertainty, as many fingers nervously hovered over death-bringing buttons, word was received from Soviet military observers: the missile, while real, was not en route to Russia. It was a harmless research rocket headed for space...
Basically, if the soldiers involved had followed their orders properly, you wouldn't be reading this and I wouldn't be writing it. In the unlikely event we were alive today, we'd be foraging for food. There certainly wouldn't be an Internet.

I really don't understand why human civilization is still around. I share the opinion of Oppenheimer and the Manhattan Project physicists -- there's no evidence to suggest that humans are capable of living with nuclear weapons.

Still. We're here. Apparently.

Odd.

Rat plagues wipe out indigenous rats

We know that European human plagues depopulated most of the Americas, and allowed the Puritans to scavenge from dead Amerindians.

I've speculated that European dog diseases (distemper?) wiped out the indigenous New World dog (true Indian dogs became extinct long ago).

So it's interesting to read that European rats wiped out native island rats -- or rather, their infections did:

Damn Interesting » The Crabs of Christmas:

... the [Christmas Island] rats were identified as another endemic species, the Maclear's rat. It seems that in the late 19th century, these rats were as numerous as the red crabs are today. Like the crabs they were scavenging creatures that lived in burrows on the forest floor, but the exact role they played in the ecology of the island will forever remain a mystery -- for by 1903 the species was extinct, wiped out by an epidemic of trypanosome parasites introduced by ship-borne black rats...
Europe's dense urban populations cooked up lethal bioweapons, carried around the world by European animals.

The US does NOT have a problem with science education

It may have a problem with science-related employment, however ....

The Science Education Myth (Business Week, Vivek Wadhwa)

Political leaders, tech executives, and academics often claim that the U.S. is falling behind in math and science education. They cite poor test results, declining international rankings, and decreasing enrollment in the hard sciences. They urge us to improve our education system and to graduate more engineers and scientists to keep pace with countries such as India and China.

Yet a new report by the Urban Institute, a nonpartisan think tank, tells a different story. The report disproves many confident pronouncements about the alleged weaknesses and failures of the U.S. education system. This data will certainly be examined by both sides in the debate over highly skilled workers and immigration (BusinessWeek.com, 10/10/07). The argument by Microsoft (MSFT), Google (GOOG), Intel (INTC), and others is that there are not enough tech workers in the U.S.

The authors of the report, the Urban Institute's Hal Salzman and Georgetown University professor Lindsay Lowell, show that math, science, and reading test scores at the primary and secondary level have increased over the past two decades, and U.S. students are now close to the top of international rankings. Perhaps just as surprising, the report finds that our education system actually produces more science and engineering graduates than the market demands....

... As far as our workforce is concerned, the new report showed that from 1985 to 2000 about 435,000 U.S. citizens and permanent residents a year graduated with bachelor's, master's, and doctoral degrees in science and engineering. Over the same period, there were about 150,000 jobs added annually to the science and engineering workforce. These numbers don't include those retiring or leaving a profession but do indicate the size of the available talent pool. It seems that nearly two-thirds of bachelor's graduates and about a third of master's graduates take jobs in fields other than science and engineering...

So the data suggests we actually graduate more scientists and engineers than we have jobs for. Encouraging science education isn't going to make more scientists, any more than encouraging farming education will make more American farmers.
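To make the report's arithmetic explicit, here's a back-of-the-envelope sketch using only the two figures quoted above (the rounding is mine):

    # Back-of-the-envelope check of the Urban Institute figures quoted above.
    grads_per_year = 435_000  # annual S&E graduates, 1985-2000 (from the excerpt)
    jobs_per_year = 150_000   # annual net additions to the S&E workforce

    surplus = grads_per_year - jobs_per_year
    print(f"Graduates beyond new S&E jobs: {surplus:,}")                  # 285,000
    print(f"Share of each year's class: {surplus / grads_per_year:.0%}")  # 66%

Which lines up with the "nearly two-thirds" figure in the excerpt.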

There's better-paid work for smart American students in other domains.

Which brings us back to the farming analogy. The US is a post-agricultural nation that does agriculture as an expensive hobby. Are we a post-science nation too?

BTW, I'm so pleased someone has done the research on this. I love to have my intuitions confirmed ...

Thursday, October 25, 2007

On saving the world - Shtetl-Optimized

I've had a post bouncing around my head for a while. It's about saving the human world. (The rest of the world will do just fine - eventually. As my 8 yo says, history just keeps happening.)

I'm going to write that post - eventually. I'll try to write the main risks down (US-China conflicts, WMDs, cost of havoc, rapid environmental collapse and resulting socioeconomic disruptions, artificial minds [1], etc) and what a geek can do about them in the age of O'Reilly.

In the meantime, a post by Scott Aaronson (yes, two t, two a)...

Shtetl-Optimized » Blog Archive » Procrastinating on the sidelines of history

... So, Al Gore. Look, I don’t think it reflects any credit on him to have joined such distinguished pacifists as Henry Kissinger and Yasser Arafat. I think it reflects credit on the prize itself. This is one of the most inspired choices a Nobel Peace Prize committee ever made, even though ironically it has nothing directly to do with peace.

With the release of An Inconvenient Truth and The Assault on Reason, it’s become increasingly apparent that Gore is the tragic hero of our age: a Lisa among Cletuses, a Jeffersonian rationalist in the age of Coulter and O’Reilly. If I haven’t said so more often on this blog, it’s simply because the mention of Gore brings up such painful memories for me.

In the weeks leading up to the 2000 US election, I could almost feel the multiverse splitting into two branches of roughly equal amplitude that would never again interact. In both branches, our civilization would continue racing into an abyss, the difference being that in one branch we’d be tapping the brakes while in the other we’d be slamming the accelerator. I knew that the election would come down to Florida and one or two other swing states, that the margin in those states would be razor-thin (of course no one could’ve predicted how thin), and that, in contrast to every other election I’d lived through, in this one every horseshoe and butterfly would make a difference. I knew that if Bush got in, I’d carry a burden of guilt the rest of my life for not having done more to prevent it.

The question was, what could a 19-year-old grad student at Berkeley do with that knowledge? How could I round up tens of thousands of extra Gore votes, and thereby seize what might be my only chance in life to change the course of history? I quickly ruled out trying to convince Bush voters, assuming them beyond persuasion. (I later found out I was wrong, when I met people who’d voted for Bush in 2000 but said they now regretted their decision. To me, it was as if they’d just noticed the blueness of the sky.)...

...In the end, though, the Nadertrading movement simply failed to reach enough of its target audience. The websites put up by me and others apparently induced at least 1,400 Nader supporters in Florida to vote for Gore — but 97,000 Floridians still voted for Nader. And as we know, Bush ended up “winning” the state by 537 votes...

Ah yes. Nader. There are no words.

But. We're not alone. Not completely.

[1] Even I wince when I write that. It's just so geeky. Tough.

Wednesday, October 24, 2007

Fragments of history: Manchester air raid shelters in abandoned mine shafts

My mother was a girl in Manchester, England during WW II. Today, after my usual stories of what the children were up to, she mentioned that as a child, early in WW II, she enjoyed the family time in the backyard bomb shelter. These were hand-dug pits with concrete bottoms, metal siding and roofs, and dirt on top. It's not clear to me how much protection these things provided, but I imagine the psychological benefit was significant.

We spoke a bit of general WW II shelter design, and she mentioned that among the Manchester shelters were abandoned mine shafts. Coal mine shafts in particular. These were group shelters, a step below subways and the like.

It seems to me there ought to be some stories about taking shelter in an abandoned coal mine. Pleasant no, memorable yes. The search terms should be specific enough to find something.

Alas, manchester "air raid" shelter coal "mine shaft" didn't come up with much today, though it did catch another story of a lost world.

In a day or two, of course, the search will find this post.

Sheila Cox says hello.

1940 was 67 years ago. Getting on halfway to the American Civil War.

Imagine someone, or something, searching on this topic 70 years from now. It's rather hard for me to imagine what they might be like.

Financial market turmoil: The problem is obvious, the fix is hard

The current explanation for why the trillion-dollar securities market is in turmoil, and may push us into recession, is that mortgage-backed security pricing was inaccurate because the credit-rating agencies didn't do their job.

Gee, I wonder why they didn't do their job? It's a mystery. Needs lots of investigation.

Or not ....

Robert Reich's Blog: Why Credit-rating Agencies Blew It: Mystery Solved

Recently, Treasury Secretary Hank Paulson sharply criticized credit-rating agencies for failing to recognize the risks in hundreds of billions worth of mortgage-backed securities whose values continue to plummet as home-loan defaults grow.

... obvious why credit-rating agencies didn't blow the whistle. (They didn't blow the whistle on Enron or WorldCom before those entities collapsed, either.) You see, credit-rating agencies are paid by the same institutions that package and sell the securities the agencies are rating...

As I've noted in other contexts, it's not necessary for anyone at the credit-rating agencies to actually conspire to create false results. It's simply emergent behavior -- natural selection in action. Competent, honest people will fail to make their numbers, so they'll either become incompetent, move to other businesses, or become dishonest. It probably takes only 1-2 years of this kind of conflict of interest to create agencies that are a mixture of the incompetent and the dishonest.

So the interesting question is not why the credit-rating agencies are dishonest and incompetent. That's obvious.

The interesting question is why Congress never fixed this back in the Enron days. Separating the raters from the issuers who pay them was proposed then, probably by Robert Reich but also by the usual suspects (Krugman, DeLong, etc).

Oh wait, we know the answer to that one too. Congress is incented by reelection and retirement jobs, which are paid for by ... the credit-rating agencies and securities companies ...

Hmm. So why does Congress get away with such incompetence and corruption ...

Oh wait, we know the answer to that one too.

The newspapers don't write about this and the talk shows and television news groups don't educate the public.

I'll leave the rest of the "Oh, waits" to the reader.

The problem is the American voter.

My answer to Jeff Atwood's question: Why Does Software Spoil?

Jeff Atwood writes Coding Horror, one of my favorite technical blogs. In two posts he asks familiar questions (remember, he's from the XP world): Coding Horror: Why Does Software Spoil? and "Are Features the Enemy".

I think the answer is pretty straightforward. The way we pay for software induces perverse incentives. Here's the comment I wrote to his blog:

It's the business model.

If we rented software then there'd be a steady revenue stream for developers, and a lesser feature incentive.

Since we buy software (sort of), developers have perverse incentives. They have to "break" the prior version by not maintaining it and by introducing new file formats, and they have to add features to hide the fact that the updates are driven by vandalism, not value.

Of course software rental completely screws customers unless the file/data formats are completely public and interoperability is assured. I'm sure customers will realize how important that is.

Thirty years from now.

Viruses and worms now help drive this model. Security issues make it much easier for companies that own both the OS and the software products to "break" older software, thereby forcing updates. Feature addition is simply a way to make customers accept the forced update. If Microsoft didn't add features, but forced Office updates by making old versions run less well, customers would scream.

These kinds of things don't really have to be planned. They're simply emergent. It's just the way natural selection operates on business.

Of course the only thing worse than the current business model is the combination of software leasing and proprietary data formats, including proprietary metadata models. (Ever try extracting and moving all your iPhoto data?)

Happily customers would never fall for that trick.

Oh, wait ...

Monday, October 22, 2007

Sibling Neandertal and the end of the modern human

I asked the other day "what was Homo sapiens doing for 115,000 years?"

I didn't expect John Hawks to answer me so quickly (emphases mine):
... we have undergone light-years of change since the last Neandertals lived. This is not a question of "modern human origins" anymore. We can now show that living people are much more different from early modern humans than any differences between Neandertals and other contemporary peoples. I think that "modern humans" is on its way to obsolescence. What matters is the pattern of change across all populations. Possibly that pattern was initiated by changes in one region but the subsequent changes were so vast that the beginning point hardly matters... [From John Hawks Anthropology Weblog : 2007 10]
Most of the popular books on human origins I've read emphasize how similar we are to Homo sapiens of 60,000 years ago. It appears that meme is on its last legs. We look a lot like our ancestors, but our minds are very different.

To answer my original question -- Homo sapiens spent the last 115,000 years turning into an animal that could write. It wasn't easy.

It's tempting to suggest we need to name a new human "species" that was launched 15,000 years ago, but these arbitrary demarcations are increasingly unconvincing, just as misleading as the concept of "the modern human".

The more we learn about the evolution of the human mind the more fluid it seems. Our modern world has many more niches for exotic minds than the ancient world; if we live long enough we'll fill those niches.

One day we might need interpreters -- even between speakers of the same language.

The rest of Hawks' article is well worth a close read. He uses a recent discovery from the Neandertal genome to support a favorite thesis of his -- that we are part Neandertal.

The Bush/Cheney regime in a nutshell

To a first approximation, this summarizes the entire reign of the modern GOP...
Suicide Is Not Painless - Frank Rich - New York Times

...The inspector general also assured Congress that neither Donald Rumsfeld nor Paul Wolfowitz knew anything about the crimes. Senators on the Armed Services Committee were incredulous. John Warner, the Virginia Republican, could not believe that the Pentagon’s top two officials had no information about “the most significant defense procurement mismanagement in contemporary history.”

But the inspector general who vouched for their ignorance, Joseph Schmitz, was already heading for the exit when he delivered his redacted report. His new job would be as the chief operating officer of the Prince Group, Blackwater’s parent company....

Sunday, October 21, 2007

Hibbing and the Hull Rust

Hibbing is a northern MN town with a remarkable history (the 10/07 Wikipedia article doesn't do it justice).

If you're ever wandering Minnesota's Iron Range, I recommend a visit to the outskirts.

Yes, as seen in this screen capture of a Google sat map, the town of Hibbing (bottom right of the image) is dwarfed by what was once the world's largest open pit mine, the Hull Rust:
".... more than three miles long, two miles wide and 535 feet deep. This man-made "Grand Canyon of the North" was the first strip mine on the Mesabi Iron Range. The amazing view continues to grow as the Hibbing Taconite Company Mine expands its mining operations.

Since 1895 more than 1.4 billion tons of earth have been removed on its 2,000 acres of land, and more than 800 million gross tons of iron ore have been shipped from the mine. At peak production in the 1940's, as much as one quarter of the ore mined in the United States came from the Hull Rust Mine."
Hibbing was a boom town before WW I, with 60 saloons, about 16,000 people, an Opera House and a gorgeous Carnegie Library. That's about the time folks realized the town was sitting on vast amounts of iron. The town was demolished; it's not clear it ever truly recovered. The Greyhound bus company started then, busing miners from their new, more distant homes.

There's a small park, little known to tourists, documenting a bit of the old town. The park itself is falling into history; it looks like a political gift that has since been neglected. There's a lot of that on the Range.

I took a hand-held panorama picture from the Hull Rust lookout and stitched it together. It was a hack job with lots of artifacts and a curious triplication in one join, but it does capture something of the view from the lookout (click to see a larger image; you can download the full res version as well):

From The Iron Range
Update 10/27: Here are some more photos from our Range trip; see also: The Wellstone memorial and the path not taken.
The Iron Range

I am 113810027503326386174. And 578762461. At least.

Today I have been re-christened 113810027503326386174. It is the ID Google assigned to the persona associated with Gordon's Notes and other blogs. I assume it will be the foundation for Google's future identity management services.

If you do various straightforward manipulations based on the geometry of the Incan pyramids, you will extract the number 666.

That persona also has my primary GMail account, though at last count I had at least four GMail personas (associated with various Google Apps domains -- nothing fishy about it).

I will need to add this new number to the page where I park all my public and related personas.

Incidentally, I read on Slashdot today that Facebook, where I am 578762461, is moving to 64-bit identifiers. That should cover the first ten minutes of the post-Singularity identity explosion.
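Taking the joke semi-literally, here's a quick sketch of the arithmetic (mine, not Slashdot's):

    # Capacity of a 64-bit identifier space.
    ids = 2 ** 64
    print(f"{ids:,} possible IDs")  # 18,446,744,073,709,551,616 (~1.8e19)

    # Rate needed to burn through the whole space in ten minutes.
    rate = ids / (10 * 60)
    print(f"{rate:.1e} new identities per second")  # ~3.1e16 per second

So a post-Singularity identity explosion would have to mint roughly thirty quadrillion personas a second to exhaust the space that quickly.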

I thought of 113810027503326386174 and tattoos, but that's not something to joke about.

It will take me a while to memorize these new names. Mnemonic anyone?

Update 10/2/08: It finally occurred to me that this is my "number of the beast". It is, after all, clear where Google is going; it's a number, and it's mine. So now I can add the religion tag to this post. As of today there are 0 hits on "Google Profile" and "Number of the Beast". Maybe I can tweak that a bit ...

Update 4/25/09: Well, that was quick.

http://www.google.com/s2/sharing/stuff?user=113810027503326386174 no longer works! Instead the new URL is www.google.com/s2/profiles/113810027503326386174, which now redirects to http://www.google.com/profiles/jfaughnan.
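If you want to check where one of these profile URLs lands this week, here's a minimal sketch using Python's standard library (the URL is the one from the update above, and it may well have moved again):

    import urllib.request

    # urlopen follows HTTP redirects by default; geturl() reports the final URL.
    url = "http://www.google.com/s2/profiles/113810027503326386174"
    with urllib.request.urlopen(url) as response:
        print(response.geturl())  # e.g. http://www.google.com/profiles/jfaughnan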

Update 9/19/11: My G+ posts incorporate this ID.

Saturday, October 20, 2007

iPhone Jan 2008 and, at last, the end of Palm

Apple doesn't make the mistake of thinking the customer knows best. Mostly this works, but sometimes it leads to perverse obstinacy. iPhoto won't import image Libraries. Aperture can't edit date metadata. Apple can be incredibly obtuse.

So it was plausible, though it bordered on the crazed, that Apple intended to own the iPhone completely, and to reserve all software production to Apple.

It felt even more plausible when Apple talked about AJAX apps as though they were a credible solution, tightened its ringtone control, and went to war with Apple geeks -- while even Nokia taunted the lion.

Plausible enough that fear of Apple's choices meant I couldn't get an iPhone until Apple met my personal requirements. I've been sitting on my wallet, even contemplating a BlackBerry. Yes, even contemplating another year with a Motorola RAZR, a replacement Palm Tungsten, and an iPod.

Then, last week, Apple promised a true iPhone/iTouch SDK in February 2008 and Pogue wrote:
... Here's my view of the timeline: Leopard ships October 26th. Apple announces a new iPhone model or models at Macworld Expo on January 15th. The models ship along with an updated OS that's more fully Leopard for iPhone as a software update by early February. The iPhone SDK appears shortly thereafter...
It seems that Apple has chosen the iTunes signed application distribution model, with Apple taking a percentage of every sale. We don't know if they'll go to a software subscription model.

I'm good with that. If they'd announced this a month ago I'd have an iPhone now.
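For readers who haven't followed the signing story, here's the core idea of signed application distribution reduced to a sketch. This is the generic mechanism, not Apple's actual implementation, and it assumes the third-party Python cryptography package:

    # General idea of signed application distribution (a sketch, not Apple's
    # actual mechanism): the distributor signs each binary, and the device
    # verifies the signature against the distributor's public key before launch.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    app_binary = b"...application bytes..."

    # Distributor side: sign the binary with a closely held private key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(app_binary, pss, hashes.SHA256())

    # Device side: verify before running. verify() raises InvalidSignature
    # if the binary was altered or signed by anyone else.
    private_key.public_key().verify(signature, app_binary, pss, hashes.SHA256())
    print("signature valid; application may launch")

The leverage is in that last step: a device that refuses to launch unverified binaries gives the key holder a veto, and a toll booth, over every piece of software on the platform.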

So was this the plan all along, or did Jobs change his mind? I suspect it was more or less the plan, but if there'd been less screaming Jobs might have tried to own the whole thing.

So why the long delay between product release and SDK announcement? Probably the 10.5 delay, which arose at least partly from the decision to take the iPhone to market. Key people were pulled from 10.5 to the iPhone, then pulled from the iPhone back to 10.5. The last step meant there was no time to do an SDK and improve the iPhone APIs, and perhaps Jobs didn't want to announce SDK plans six months ahead of time.

They also needed to get everything working flawlessly with 10.5, which was hard to do before 10.5 went public.

We're close enough to January 15th that I'll wait to see if Pogue's right, but clearly this is great news for Apple geeks.

Oh, and what does this have to do with Palm?

As long as it seemed possible that Apple was going to keep the iPhone a strictly entertainment-oriented device, Palm still had a ray of hope. That light just went out. Assuming Apple doesn't screw up, and I don't think they will, a large developer community will fill any Palm functionality that Apple doesn't provide.

Palm, at last, is finished.

Update: Daring Fireball has a detailed discussion of a hypothetical but plausible signed distribution model and how it would work for developers.

Update 10/21/07: Glenn Fleishman of Tidbits covered this topic in more nuanced detail. Nokia's "Symbian Signed Application" program might be Apple's model. Glenn also mentions something I forgot: Leopard's signed application model is probably a prerequisite for the iPhone's application distribution mechanism. No mention of what this has to do with current iPhone applications "running as root", but it sure feels like the iPhone was originally designed to run 10.5, and that the shipping version was put together in a hell of a rush.

It's a miracle the 1.0 iPhone works as well as it does. One day someone will write a heck of a geek book about the iPhone's creation. I'm imagining it involved flogging and illegal stimulants.

Update 10/21/07: Michael Tsai says we shouldn't be so trusting; the word "SDK" need not mean the ability to deliver the class of products Apple delivers. So if we don't hear more details with iPhone 2.0 in January, waiting for February would be a good idea.

Fallows says No to Mukasey

If Mukasey is rejected, Bush will use some in-place stooge to run the department through 2008. No matter; Fallows is right:
James Fallows (October 19, 2007) - Mukasey: No

....A specific point: the 'waterboarding' outrage. As is now becoming famous, Mukasey said this, when asked by Sen. Sheldon Whitehouse whether waterboarding was constitutional: “I don’t know what is involved in the technique,” Mr. Mukasey replied. “If waterboarding is torture, torture is not constitutional.”

Either way you slice it, this answer alone is grounds for rejecting Mukasey. If he really doesn't 'know what is involved' in the technique, he is unacceptably lazy or ill-informed. Any citizen can learn about this technique with a few minutes on the computer.* Any nominee for Attorney General in 2007 who has not taken the time to inform himself fits the pattern of ignorant incuriosity we can no longer afford at the highest levels.