Wednesday, October 24, 2007

My answer to Jeff Atwood's question: Why Does Software Spoil?

Jeff Atwood writes Coding Horror, one of my favorite technical blogs. In two posts he asks familiar questions (remember, he's from the XP world): "Why Does Software Spoil?" and "Are Features the Enemy?"

I think the answer is pretty straightforward. The way we pay for software induces perverse incentives. Here's the comment I wrote to his blog:

It's the business model.

If we rented software then there'd be a steady revenue stream for developers, and a lesser feature incentive.

Since we buy software (sort of), developers have perverse incentives. They have to "break" the prior version, by not maintaining it and by introducing new file formats, and they have to add features to hide the fact that the updates are driven by vandalism, not value.

Of course software rental completely screws customers unless the file and data formats are fully public and interoperability is assured. I'm sure customers will realize how important that is.

Thirty years from now.

Viruses and worms now help drive this model. Security issues make it much easier for companies that own both the OS and the software products to "break" older software, thereby forcing updates. Feature addition is simply a way to make customers accept the forced update. If Microsoft didn't add features, but forced Office updates by making old versions run less well, customers would scream.

These kinds of things don't really have to be planned. They're simply emergent. It's just the way natural selection operates on business.

Of course the only thing worse than the current business model is the combination of software leasing and proprietary data formats, including proprietary metadata models. (Ever try extracting and moving all your iPhoto data?)

Happily customers would never fall for that trick.

Oh, wait ...

Monday, October 22, 2007

Sibling Neandertal and the end of the modern human

I asked the other day "what was Homo sapiens doing for 115,000 years?"

I didn't expect John Hawks to answer me so quickly (emphases mine):
... we have undergone light-years of change since the last Neandertals lived. This is not a question of "modern human origins" anymore. We can now show that living people are much more different from early modern humans than any differences between Neandertals and other contemporary peoples. I think that "modern humans" is on its way to obsolescence. What matters is the pattern of change across all populations. Possibly that pattern was initiated by changes in one region but the subsequent changes were so vast that the beginning point hardly matters... [From John Hawks Anthropology Weblog : 2007 10]
Most of the popular books on human origins I've read emphasize how similar we are to Homo sapiens of 60,000 years ago. It appears that meme is on its last legs. We look a lot like our ancestors, but our minds are very different.

To answer my original question -- Homo sapiens spent the last 115,000 years turning into an animal that could write. It wasn't easy.

It's tempting to suggest we need to name a new human "species" that was launched 15,000 years ago, but these arbitrary demarcations are increasingly unconvincing, just as misleading as the concept of "the modern human".

The more we learn about the evolution of the human mind the more fluid it seems. Our modern world has many more niches for exotic minds than the ancient world; if we live long enough we'll fill those niches.

One day we might need interlocutors -- even between speakers of the same language.

The rest of Hawks' article is well worth a close read. He uses a recent discovery from the Neandertal genome to support a favorite thesis of his -- that we are part Neandertal.

The Bush/Cheney regime in a nutshell

To a first approximation, this summarizes the entire reign of the modern GOP...
Suicide Is Not Painless - Frank Rich - New York Times

...The inspector general also assured Congress that neither Donald Rumsfeld nor Paul Wolfowitz knew anything about the crimes. Senators on the Armed Services Committee were incredulous. John Warner, the Virginia Republican, could not believe that the Pentagon’s top two officials had no information about “the most significant defense procurement mismanagement in contemporary history.”

But the inspector general who vouched for their ignorance, Joseph Schmitz, was already heading for the exit when he delivered his redacted report. His new job would be as the chief operating officer of the Prince Group, Blackwater’s parent company....

Sunday, October 21, 2007

Hibbing and the Hull Rust

Hibbing is a northern MN town with a remarkable history (the 10/07 Wikipedia article doesn't do it justice).

If you're ever wandering Minnesota's Iron Range, I recommend a visit to the outskirts.

Yes, as seen in this screen capture of a Google satellite map, the town of Hibbing (bottom right of the image) is dwarfed by what was once the world's largest open pit mine - the Hull Rust:
".... more than three miles long, two miles wide and 535 feet deep. This man-made "Grand Canyon of the North" was the first strip mine on the Mesabi Iron Range. The amazing view continues to grow as the Hibbing Taconite Company Mine expands its mining operations.

Since 1895 more than 1.4 billion tons of earth have been removed on its 2,000 acres of land, and more than 800 million gross tons of iron ore have been shipped from the mine. At peak production in the 1940's, as much as one quarter of the ore mined in the United States came from the Hull Rust Mine."
Hibbing was a boom town before WW I, with 60 saloons, about 16,000 people, an Opera House and a gorgeous Carnegie Library. That's about the time folks realized the town was sitting on vast amounts of iron. The town was demolished; it's not clear it ever truly recovered. The Greyhound bus company started then, busing miners from their new, more distant homes.

There's a small park, little known to tourists, documenting a bit of the old town. The park itself is falling into history; it looks like a political gift that has since been neglected. There's a lot of that in the range.

I took a hand-held panorama picture from the Hull Rust lookout and stitched it together. It was a hack job with lots of artifacts and a curious triplication in one join, but it does capture something of the view from the lookout (click to see larger image, you can download the full res as well):

From The Iron Range
Update 10/27: Here are some more photos from our Range trip in The Iron Range album; see also: The Wellstone memorial and the path not taken.

I am 113810027503326386174. And 578762461. At least.

Today I have been re-christened 113810027503326386174. It is the ID Google assigned to the persona associated with Gordon's Notes and other blogs. I assume it will be the foundation for Google's future identity management services.

If you do various straightforward manipulations based on the geometry of the Incan pyramids, you will extract the number 666.

That persona also has my primary GMail account, though at last count I had at least four GMail personas (associated with various Google App domains -- nothing fishy about it).

I will need to add this new number to the page where I park all my public and related personas.

Incidentally, I read on Slashdot today that Facebook, where I am 578762461, is moving to 64 bit identifiers. That should cover the first ten minutes of the post-Singularity identity explosion.
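For scale, here's a back-of-the-envelope sketch of what 64 bits actually buys. This is purely illustrative arithmetic; the minting rates are invented assumptions, not anything Facebook or Google has published:

```python
# Rough capacity math for 32 bit vs 64 bit identifiers (illustrative only).
ids_32 = 2 ** 32  # ~4.3 billion, roughly one per living human in 2007
ids_64 = 2 ** 64  # ~1.8 x 10^19

print(f"32 bit space: {ids_32:,}")
print(f"64 bit space: {ids_64:,}")

# How fast would identities have to be minted to burn through 64 bits
# in the joked-about "first ten minutes"?
ten_minutes = 10 * 60  # seconds
rate_needed = ids_64 / ten_minutes
print(f"Exhausting 64 bits in ten minutes takes ~{rate_needed:.2e} IDs/second")

# Conversely, at a (still absurd) billion new identities per second:
rate = 1_000_000_000
years = ids_64 / rate / (3600 * 24 * 365)
print(f"At 1e9 IDs/second the space lasts about {years:,.0f} years")
```

Even a billion new identities a second takes centuries to exhaust 64 bits, so the ten-minute joke requires a truly post-Singularity minting rate.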

I thought of 113810027503326386174 and tattoos, but that's not something to joke about.

It will take me a while to memorize these new names. Mnemonic anyone?

Update 10/2/08: It finally occurred to me that this is my "number of the beast". It is, after all, clear where Google is going, it's a number, and it's mine. So now I can add the religion tag to this post. As of today there are 0 hits on "Google Profile" and "Number of the Beast". Maybe I can tweak that a bit ...

Update 4/25/09: Well, that was quick.

http://www.google.com/s2/sharing/stuff?user=113810027503326386174 no longer works! Instead the new URL is www.google.com/s2/profiles/113810027503326386174 which now redirects to http://www.google.com/profiles/jfaughnan.

Update 9/19/11: My G+ posts incorporate this ID

Saturday, October 20, 2007

iPhone Jan 2008 and, at last, the end of Palm

Apple doesn't make the mistake of thinking the customer knows best. Mostly this works, but sometimes it leads to perverse obstinacy. iPhoto won't import image Libraries. Aperture can't edit date metadata. Apple can be incredibly obtuse.

So it was plausible, though it bordered on the crazed, that Apple intended to own the iPhone completely, and to reserve all software production to Apple.

It felt even more plausible when Apple talked about AJAX apps like they were a credible solution, tightened its ringtone control, and went to war with Apple geeks; even Nokia taunted the lion.

Plausible enough that, fearing Apple's choices, I couldn't get an iPhone until Apple met my personal requirements. I've been sitting on my wallet, even contemplating a BlackBerry. Yes, even contemplating another year with a Motorola RAZR, a replacement Palm Tungsten, and an iPod.

Then, last week, Apple promised a true iPhone/iTouch SDK in February 2008 and Pogue wrote:
... Here's my view of the timeline: Leopard ships October 26th. Apple announces a new iPhone model or models at Macworld Expo on January 15th. The models ship along with an updated OS that's more fully Leopard for iPhone as a software update by early February. The iPhone SDK appears shortly thereafter...
It seems that Apple has chosen the iTunes signed application distribution model, with Apple taking a percentage of every sale. We don't know if they'll go to a software subscription model.

I'm good with that. If they'd announced this a month ago I'd have an iPhone now.

So was this the plan all along, or did Jobs change his mind? I suspect it was more or less the plan, but if there'd been less screaming Jobs might have tried to own the whole thing.

So why the long delay between product release and SDK announcement? Probably the 10.5 delay, which arose at least partly from the decision to take the iPhone to market. Key people were pulled from 10.5 to the iPhone, then pulled from the iPhone back to 10.5. The last step meant there was no time to do an SDK and improve the iPhone APIs, and perhaps Jobs didn't want to announce SDK plans six months ahead of time.

They also needed to get everything working flawlessly with 10.5, which was hard to do before 10.5 went public.

We're close enough to Jan 15th I'll wait to see if Pogue's right, but clearly this is great news for Apple geeks.

Oh, and what does this have to do with Palm?

As long as it seemed possible that Apple was going to keep the iPhone a strictly entertainment-oriented device, Palm still had a ray of hope. That light just went out. Assuming Apple doesn't screw up, and I don't think they will, a large developer community will fill any Palm functionality that Apple doesn't provide.

Palm, at last, is finished.

Update: Daring Fireball has a detailed discussion of a hypothetical but plausible signed distribution model and how it would work for developers.

Update 10/21/07: Glenn Fleishman of Tidbits covered this topic with more nuanced detail. Nokia's "Symbian Signed Application" program might be Apple's model. Glenn also mentions something I forgot, that Leopard's signed application model is probably a prerequisite for the iPhone's application distribution mechanism. No mention of what this has to do with current iPhone applications "running as root", but it sure feels like the iPhone was originally designed to run 10.5, and that the shipping version was put together in a hell of a rush.

It's a miracle the 1.0 iPhone works as well as it does. One day someone will write a heck of a geek book about the iPhone's creation. I'm imagining it involved flogging and illegal stimulants.

Update 10/21/07: Michael Tsai says we shouldn't be so trusting; the word "SDK" need not mean the ability to deliver the class of products Apple delivers. So if we don't hear more details with iPhone 2.0 in January, waiting for February would be a good idea.

Fallows says No to Mukasey

If Mukasey is rejected Bush will use some in-place stooge to run the department through 2008. No matter, Fallows is right:
James Fallows (October 19, 2007) - Mukasey: No

....A specific point: the 'waterboarding' outrage. As is now becoming famous, Mukasey said this, when asked by Sen. Sheldon Whitehouse whether waterboarding was constitutional: “I don’t know what is involved in the technique,” Mr. Mukasey replied. “If waterboarding is torture, torture is not constitutional.”

Either way you slice it, this answer alone is grounds for rejecting Mukasey. If he really doesn't 'know what is involved' in the technique, he is unacceptably lazy or ill-informed. Any citizen can learn about this technique with a few minutes on the computer.* Any nominee for Attorney General in 2007 who has not taken the time to inform himself fits the pattern of ignorant incuriosity we can no longer afford at the highest levels.

The cult of IQ

It occurred to me that Greens and Grays and IQ is a good excuse to say something about the cult of IQ (that is, whatever IQ tests measure).

Very quickly (because the kids are getting restless):
  1. Whatever IQ score means, it's only loosely correlated with most measures of "success" - at least in our world. It certainly doesn't correlate with number of genetic descendants, and it doesn't correlate well with leadership success or even wealth. There are lots of poor and/or unhappy members of Mensa, and lots of very successful entrepreneurs with unremarkable IQ scores.
  2. Whatever value IQ might have in today's world, it will probably have about as much value in 30 years as muscle has had since the steam engine was fully implemented. The strong become weak, the weak become strong; it's the selfish justification for "compassion".
  3. There's not much evidence that IQ correlates with either insight or judgment. I suspect one day we'll figure out they have pretty different physiology, evolutionary history, and adaptive advantages. I don't think George Bush's problem is that he's dumb; his old SAT scores tell us that, at least as a teen, he had a quite decent IQ.
IQ is useful, but generally overrated.

Thursday, October 18, 2007

So what was Homo Sapiens DOING for 115,000 years?

After humans were hunters, but before they were farmers, they learned to fish ...
ASU team detects earliest modern humans | ASU News

After decades of debate, paleoanthropologists now agree the genetic and fossil evidence suggests that the modern human species – Homo sapiens – evolved in Africa between 100,000 and 200,000 years ago...

... “Generally speaking, coastal areas were of no use to early humans – unless they knew how to use the sea as a food source” says Marean. “For millions of years, our earliest hunter-gatherer relatives only ate terrestrial plants and animals. Shellfish was one of the last additions to the human diet before domesticated plants and animals were introduced.”

Before, the earliest evidence for human use of marine resources and coastal habitats was dated about 125,000 years ago. “Our research shows that humans started doing this at least 40,000 years earlier. This could have very well been a response to the extreme environmental conditions they were experiencing,” he says.

“We also found what archaeologists call bladelets – little blades less than 10 millimeters in width, about the size of your little finger,” Marean says. “These could be attached to the end of a stick to form a point for a spear, or lined up like barbs on a dart – which shows they were already using complex compound tools. And, we found evidence that they were using pigments, especially red ochre, in ways that we believe were symbolic,” he describes.

Archaeologists view symbolic behavior as one of the clues that modern language may have been present. The earliest bladelet technology was previously dated to 70,000 years ago, near the end of the Middle Stone Age, and the modified pigments are the earliest securely dated and published evidence for pigment use.

“Coastlines generally make great migration routes,” Marean says. “Knowing how to exploit the sea for food meant these early humans could now use coastlines as productive home ranges and move long distances.”..
From the press release alone, it seems the significant observation is that early Homo sapiens may have evolved by the ocean. Hard to know if that explains why we are, for a primate, terrific swimmers [1]. The study also moves a key cognitive task, bladelet creation, back another 60,000 years.

So if humans could manufacture bladelets 125,000 years ago, what the heck were they doing for 115,000 years prior to conquest of the planet? That's a heck of a long time in the context of human evolution.

We have a lot in common with those early Homo sapiens, but I suspect our minds are pretty different.

Update 10/20/07: I remembered this was called the "aquatic ape theory". It may have been popular in the 1970s. It's suffered from some eccentric proponents over the years.

Greens and Grays and IQ

James D. Watson appears to be a member of the Bell Curve club. He's also very old, and I suspect his own IQ is nowhere near where it once was.

Whatever the cause of Watson's opinion, the topic has led to the usual questions about the genetics of "whatever it is that IQ tests test". I read the NYT response as relatively cautious about the influence of post-natal environment on IQ. It could be read as acknowledging that IQ is largely determined by genes and the intrauterine environment, with very little other environmental influence. I think that is roughly the current scientific consensus.

I've written about this before; it's a fascinating if unsettling topic. Ashkenazi Jews and South Koreans seem to be unusually good at clever things, and for the former there are even some suggestive genes to inspect.

But what of it?

Let us assume the human race was divided into Greens and Grays, and that Greens scored 20 points higher on IQ tests than the Grays. This would translate into lots of Green wealth and power.

What would the Greens then owe the Grays? What do the strong owe the less strong? That, to me, is the more important question.

I, of course, am a good commie. From each according to their ability, to each according to their need. Adjusted for human limitations of course.

There are no American professional hockey players

Or so I might think, judging from the U.S. Hockey Hall of Fame Museum Inductees. I gather from the absence of the inevitable inductees like Rocket Richard and Wayne Gretzky that only US-born players can join up. It's a paltry list, and I don't recognize any of them from the years I followed hockey.

I think they need to bend the rules a bit. Wayne married an American after all, and for all I know he's naturalized by now. Heck, what about Jacques Lemaire, now residing in my hometown? Surely Jacques must have a green card ...

In Our Time: Opium Wars and the 2008 Olympics, Spinoza's radical determinism and the new feed page

Melvyn Bragg's BBC show, In Our Time, has begun a new season. I'm a fan.

The bad news is that the BBC is sticking with its execrable latest-episode-only download policy. So if you want to listen to the superb Opium War episode on your MP3 player you need to either use Audio Hijack Pro to capture the RealAudio stream or (if you know me) ask me for a DVD with the entire series [1]. Incidentally, this is a good time to write a quick email to set IOT free.

The good news is there's a new page that makes it easy to subscribe to a feed. I used to subscribe via iTunes, but if I went a week without using iTunes I missed the show. Now I subscribe via iTunes and Bloglines; I use Bloglines at least daily so it's easy for me to save the MP3 and email it to myself.
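If you'd rather not depend on remembering to open an aggregator at all, a small script can do the same chore. Here's a minimal sketch using the feedparser library; the feed URL is a placeholder I made up, so substitute the real address from the BBC's subscription page:

```python
# Grab the newest MP3 enclosure from a podcast feed before the BBC rotates it out.
# The feed URL below is a placeholder, not the real In Our Time address.
import urllib.request

import feedparser

FEED_URL = "http://example.org/in-our-time/podcast.rss"  # placeholder

feed = feedparser.parse(FEED_URL)
if feed.entries:
    latest = feed.entries[0]
    # Podcast feeds attach the audio file to each entry as an "enclosure".
    for enclosure in latest.get("enclosures", []):
        if enclosure.get("type") == "audio/mpeg":
            filename = enclosure["href"].rsplit("/", 1)[-1]
            print(f"Saving '{latest.title}' as {filename}")
            urllib.request.urlretrieve(enclosure["href"], filename)
```

Run something like this from a daily scheduled task and the latest-episode-only policy stops mattering quite so much.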

On to Opium War. Alas, it's from last season, so you're stuck with theft or RealAudio hijacking [2]. Wonderful episode that cleverly features 2 UK professors with Chinese names [4]:

Yangwen Zheng, Lecturer in Modern Chinese History at the University of Manchester
Lars Laamann, Research Fellow in Chinese History at the School of Oriental and African Studies (SOAS), University of London
Xun Zhou, Research Fellow in History at SOAS, University of London

That was a politically astute decision as well as didactically informed. These 3 are willing to say things that people with non-Chinese surnames are going to pussy-foot about. Even the one contributor whose voice trembles slightly when describing her parents' outrage at the "unequal treaty" more or less concedes that her feelings reflect modern sentiments rather than the historic record.

In brief (sorry, you need to listen, these are my interpretations):

  • To understand this as a 19th century person might, think 1920s prohibition or 2010 cigarette management. Opium was an over-the-counter remedy in America well into the 20th century; opium was the now-lost secret to well-behaved children around the world. [3]
  • Tobacco smoking, introduced to China by the Portuguese, was the technical innovation that built the opium trade. Smoking opium is much more entertaining than eating it, so, as is forever true, tobacco was the gateway drug.
  • Lin Tse-Hsu, the Chinese intellectual, modernist, and bureaucrat who triggered the smoldering conflict, might have been pleased with the long-term impact of the enforced opium trade. Lin Tse-Hsu wanted China to modernize and be able to stand independently. These scholars agreed that China's defeat in the Opium Wars, and the resulting trade agreements including later trade in industrial goods, was a major contributor to the rise of modern China. So Lin Tse-Hsu lost his battle, but in losing he did achieve his true desire. I wonder if he ever realized that. History has strange lessons indeed.
  • For 19th century China the Opium War was something of a sideshow and trading Hong Kong was a trivial cost to placate the transiently powerful foreigners. The Dynasty had much bigger internal problems to worry about.
  • Opium was a currency in China before the war, especially after the introduction of smoking (which must have increased the value of the currency tenfold), an alternative to copper. From an economic perspective the war resulted from a balance of trade problem. England was industrializing, and like all industrial nations they were switching from alcohol (locally grown) to caffeine (tea, imported from China -- coffee was not yet widely available). Alcohol was handy for dulling the pain of pre-industrial life, but industry required shorter sleep periods. England was hooked on uppers, but pre-industrial China was hooked on anesthesia. Prior to the Opium War England sent New World silver to China to buy tea, but Chinese trade restrictions meant England had nothing to sell in return. China was the "silver drain" of the world. The cost of tea was rising fast, and something had to be done.
  • Britain's attack was a mixture of governmental and private sector action. In those days the boundaries between industry and state were even thinner than in modern America -- and they're pretty darned thin here.
  • After the 1920s the Opium War, previously an annoyance primarily to Chinese intelligentsia, was transformed into a populist cause to further nationalist movements. So the Opium War not only transformed China economically, it did double duty in creating the modern Chinese nation. So it remains today; there is no doubt that many Chinese leaders, and most of the Chinese nation, bitterly resent what they know of the conflict. This is very human of course. Americans who say "Remember the Alamo" typically know very little about it, and no Chinese leader can possibly be as ignorant of history as George Bush Jr.

It's a great show and really, required listening for anyone living in the Decade of China to come. The 2008 Olympics are near, and, assuming the news is not entirely about athletic asphyxiation, you'll hear more about the two Opium Wars.

On the other hand, last season's Spinoza episode, while better than the immensely dull "William of Occam", was still disappointing. Spinoza was a radical determinist, but none of the speakers put this into the 17th century context of Calvin and Newton (billiard-ball determinism), or the 20th century context of Einstein (non-quantum General Relativity implies rigid determinism), post-modern physics, transactional interpretations of quantum physics, or even the Tralfamadorians. Melvyn was asleep at the switch on that one.

Oh, Occam? Don't bother. It reminded me too much of my work.

[1] Note to BBC. I'm just joking of course.

[2] Do any of the file sharing sites do IOT? If they do I might just try out an OS X client.

[3] One of the most memorable drug seeking patients I've encountered asked me, incidentally as our routine visit was ending, for an opium containing remedy that I think was a prescription med for children in the mid-20th century. I had no idea what it was, but of course I looked it up before prescribing. I recall she was very professional about it, she didn't get too upset when I pointed out that it wasn't really a good idea.

[4] I originally wrote "3" because I swear Lars sounded like he had a slight Chinese accent. I wonder which was his first language. The other profs pronounced his name something like "Lau".

Wednesday, October 17, 2007

Romney for torture

Romney's New National Security Adviser Said He'd Torture "In A Heartbeat".

If Romney were Christian, he might have theological issues with his pro-torture stance. On the other hand, I suppose the Inquisition is a relevant precedent.

I'm sure glad I'm not Republican. I can't imagine the horror of choosing the least bad bozo in this field.

British Telecom's futurist predicts end of the world in about 10 years

Ian Pearson tries to predict the future for British Telecom. I think I've previously written about his 2006 Technology Timeline.

Now he's being interviewed by Computerworld about the development of sentient machines.

Computerworld - BT Futurist: AI entities will win Nobel prizes by 2020

...We will probably make conscious machines sometime between 2015 and 2020...

... I think that we still should expect a conscious computer smarter than people by 2020. I still see no reason why that it is not going to happen in that time frame....

... they will get very, very clever. It's kind of like a hamster trying to understand a human being. They can't simply understand the problem. How could they possibly think in the same way? It's like as if a human being is compared with an alien intelligence, which is hundreds of millions of times smarter. We don't have the right capabilities to start thinking in the same way. So, we put machines winning Nobel Prizes in our technology timeline, because we got good reasons to do that...

The scenario is very familiar to anyone who's done their essential reading. Certainly I've written about it enough. Once machines get to hamster level, much less consciousness, it's basically game over. I'm not entirely confident humanity will vanish immediately; that might depend on the AI's sense of humor.

What's novel is the date -- that's the earliest prediction I've read from anyone gainfully employed. Most predictions are out around 2040, where I have a decent chance of being safely oblivious (whether dead or alive). In ten years I might still be standing.

I think he's wrong. Actually, if I were the praying type, I'd pray he's wrong. I'm more inclined to 2050 myself, which makes it my kids' problem. Once you move it out to 2050 there's a decent chance of prior civilizational collapse anyway, which could give Homo sapiens a bit of a longer run.

BTW, wouldn't a TV show about a top secret spy organization that goes around the world messing up AI projects be a lot of fun?

Collateral damage - Microsoft destroys an ISO standards committee

Microsoft probably didn't mean to destroy an international standards group that works with file format specifications. They simply wanted their "standard", OOXML, to be approved. There was only one problem: the committee wasn't going to play ball. So Microsoft bought the committee, bribing a large number of nations to join up.

To everyone's surprise, the initiative failed anyway.

There was, however, some collateral damage ...
Slashdot | Format Standards Committee "Grinds To a Halt"

Andy Updegrove writes:

"Microsoft's OOXML did not get enough votes to be approved the first time around in ISO/IEC — notwithstanding the fact that many countries joined the Document Format and Languages committee in the months before voting closed, almost all of them voting to approve OOXML. Unfortunately, many of these countries also traded up to 'P' level membership at the last minute to gain more influence. Now the collateral damage is setting in. At least 50% of P members must vote (up, down, or abstain) on every standard at each ballot — and none of the new members are bothering to vote, despite repeated pleas from the committee chair. Not a single ballot has passed since the OOXML vote closed. In the chairman's words, the committee has 'ground to a halt.'..."
The honorable thing for Microsoft to do now would be to pay their shills to resign from the committee. Nobody is holding their breath.

Way to go Ballmer.