Friday, October 26, 2007

The US does NOT have a problem with science education

It may have a problem with science-related employment, however ....

The Science Education Myth (Business Week, Vivek Wadhwa)

Political leaders, tech executives, and academics often claim that the U.S. is falling behind in math and science education. They cite poor test results, declining international rankings, and decreasing enrollment in the hard sciences. They urge us to improve our education system and to graduate more engineers and scientists to keep pace with countries such as India and China.

Yet a new report by the Urban Institute, a nonpartisan think tank, tells a different story. The report disproves many confident pronouncements about the alleged weaknesses and failures of the U.S. education system. This data will certainly be examined by both sides in the debate over highly skilled workers and immigration (BusinessWeek.com, 10/10/07). The argument by Microsoft (MSFT), Google (GOOG), Intel (INTC), and others is that there are not enough tech workers in the U.S.

The authors of the report, the Urban Institute's Hal Salzman and Georgetown University professor Lindsay Lowell, show that math, science, and reading test scores at the primary and secondary level have increased over the past two decades, and U.S. students are now close to the top of international rankings. Perhaps just as surprising, the report finds that our education system actually produces more science and engineering graduates than the market demands....

... As far as our workforce is concerned, the new report showed that from 1985 to 2000 about 435,000 U.S. citizens and permanent residents a year graduated with bachelor's, master's, and doctoral degrees in science and engineering. Over the same period, there were about 150,000 jobs added annually to the science and engineering workforce. These numbers don't include those retiring or leaving a profession but do indicate the size of the available talent pool. It seems that nearly two-thirds of bachelor's graduates and about a third of master's graduates take jobs in fields other than science and engineering...

So the data suggests we actually graduate more scientists and engineers than we have jobs for. Encouraging science education isn't going to make more scientists, any more than encouraging farming education will make more American farmers.
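
As a back-of-the-envelope check, here's a minimal Python sketch using only the two figures quoted above. Like the report's own comparison, it sets new graduates against net job growth and ignores retirements and attrition, so treat it as a rough bound, not a labor-market model:

```python
# Rough check of the Urban Institute figures quoted above (1985-2000 averages).
grads_per_year = 435000  # US citizens/permanent residents earning S&E degrees annually
jobs_per_year = 150000   # S&E jobs added annually over the same period

absorbed = jobs_per_year / grads_per_year
print(f"Share of graduates the new S&E jobs could absorb: {absorbed:.0%}")  # ~34%
print(f"Share left to work in other fields: {1 - absorbed:.0%}")            # ~66%
```

The ~66% remainder lines up with the report's finding that nearly two-thirds of bachelor's graduates take jobs outside science and engineering.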

There's better paid work for smart American students in other domains.

Which brings us back to the farming analogy. The US is a post-agricultural nation that does agriculture as an expensive hobby. Are we a post-science nation too?

BTW, I'm so pleased someone has done the research on this. I love to have my intuitions confirmed ...

Thursday, October 25, 2007

On saving the world - Shtetl-Optimized

I've had a post bouncing around my head for a while. It's about saving the human world. (The rest of the world will do just fine - eventually. As my 8 yo says, history just keeps happening.)

I'm going to write that post - eventually. I'll try to write the main risks down (US-China conflicts, WMDs, cost of havoc, rapid environmental collapse and resulting socioeconomic disruptions, artificial minds [1], etc.) and what a geek can do about them in the age of O'Reilly.

In the meantime, a post by Scott Aaronson (yes, two t, two a)...

Shtetl-Optimized » Blog Archive » Procrastinating on the sidelines of history

... So, Al Gore. Look, I don’t think it reflects any credit on him to have joined such distinguished pacifists as Henry Kissinger and Yasser Arafat. I think it reflects credit on the prize itself. This is one of the most inspired choices a Nobel Peace Prize committee ever made, even though ironically it has nothing directly to do with peace.

With the release of An Inconvenient Truth and The Assault on Reason, it’s become increasingly apparent that Gore is the tragic hero of our age: a Lisa among Cletuses, a Jeffersonian rationalist in the age of Coulter and O’Reilly. If I haven’t said so more often on this blog, it’s simply because the mention of Gore brings up such painful memories for me.

In the weeks leading up to the 2000 US election, I could almost feel the multiverse splitting into two branches of roughly equal amplitude that would never again interact. In both branches, our civilization would continue racing into an abyss, the difference being that in one branch we’d be tapping the brakes while in the other we’d be slamming the accelerator. I knew that the election would come down to Florida and one or two other swing states, that the margin in those states would be razor-thin (of course no one could’ve predicted how thin), and that, in contrast to every other election I’d lived through, in this one every horseshoe and butterfly would make a difference. I knew that if Bush got in, I’d carry a burden of guilt the rest of my life for not having done more to prevent it.

The question was, what could a 19-year-old grad student at Berkeley do with that knowledge? How could I round up tens of thousands of extra Gore votes, and thereby seize what might be my only chance in life to change the course of history? I quickly ruled out trying to convince Bush voters, assuming them beyond persuasion. (I later found out I was wrong, when I met people who’d voted for Bush in 2000 but said they now regretted their decision. To me, it was as if they’d just noticed the blueness of the sky.)...

...In the end, though, the Nadertrading movement simply failed to reach enough of its target audience. The websites put up by me and others apparently induced at least 1,400 Nader supporters in Florida to vote for Gore — but 97,000 Floridians still voted for Nader. And as we know, Bush ended up “winning” the state by 537 votes...

Ah yes. Nader. There are no words.

But. We're not alone. Not completely.

[1] Even I wince when I write that. It's just so geeky. Tough.

Wednesday, October 24, 2007

Fragments of history: Manchester air raid shelters in abandoned mine shafts

My mother was a girl in Manchester, England during WW II. Today, after my usual stories of what the children were up to, she mentioned that as a child, early in WW II, she enjoyed the family time in the backyard bomb shelter. These were hand-dug pits with concrete bottoms, metal siding and roofs, and dirt on top. It's not clear to me how much protection these things provided, but I imagine the psychological benefit was significant.

We spoke a bit of general WW II shelter design, and she mentioned that among the Manchester shelters were abandoned mine shafts. Coal mine shafts in particular. These were group shelters, a step below subways and the like.

It seems to me there ought to be some stories about taking shelter in an abandoned coal mine. Pleasant no, memorable yes. The search terms should be specific enough to find something.

Alas, manchester "air raid" shelter coal "mine shaft" didn't come up with much today, though it did catch another story of a lost world.

In a day or two, of course, the search will find this post.

Sheila Cox says hello.

1940 was 67 years ago. Getting on halfway back to the American Civil War.

Imagine someone, or something, searching on this topic 70 years from now. It's rather hard for me to imagine what they might be like.

Financial market turmoil: The problem is obvious, the fix is hard

The current explanation of why the trillion-dollar securities market is in turmoil, which may push us into recession, is that mortgage-backed security pricing was inaccurate because credit-rating agencies didn't do their job.

Gee, I wonder why they didn't do their job? It's a mystery. Needs lots of investigation.

Or not ....

Robert Reich's Blog: Why Credit-rating Agencies Blew It: Mystery Solved

Recently, Treasury Secretary Hank Paulson sharply criticized credit-rating agencies for failing to recognize the risks in hundreds of billions worth of mortgage-backed securities whose values continue to plummet as home-loan defaults grow.

... obvious why credit-rating agencies didn't blow the whistle. (They didn't blow the whistle on Enron or Worldcom before those entities collapsed, either.) You see, credit-rating agencies are paid by the same institutions that package and sell the securities the agencies are rating...

As I've noted in other contexts, it's not necessary for anyone at the credit-rating agencies to actually conspire to create false results. It's simply emergent behavior -- natural selection in action. Competent, honest people will fail to hit their incentive targets, so they'll either become incompetent, move to other businesses, or become dishonest. It probably only takes 1-2 years of this type of conflict of interest to create agencies that are a mixture of incompetent and dishonest.
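
The selection argument is easy to sketch. Below is a toy simulation of my own - the ten-agency market, the win probabilities, and the two-year patience threshold are all invented for illustration, and none of it comes from Reich's post. Issuers steer business toward agencies that rate them favorably; an honest agency that goes two lean years in a row exits or changes strategy:

```python
import random

# Toy model of rating-agency selection (illustrative only; all numbers invented).
# Issuers prefer agencies that rate their securities favorably, so "lenient"
# agencies win business more often. Honest agencies that go two lean years
# in a row either exit or switch strategy - no conspiracy required.

random.seed(42)
agencies = [{"honest": True, "dry_spell": 0} for _ in range(10)]

for year in range(10):
    for agency in agencies:
        win_prob = 0.3 if agency["honest"] else 0.7  # honesty loses business
        if random.random() < win_prob:
            agency["dry_spell"] = 0   # won a mandate this year
        else:
            agency["dry_spell"] += 1  # another year without revenue
        if agency["honest"] and agency["dry_spell"] >= 2:
            agency["honest"] = False  # exit, or adopt the winning strategy
            agency["dry_spell"] = 0
    honest = sum(a["honest"] for a in agencies)
    print(f"year {year + 1}: {honest}/10 agencies still honest")
```

Honesty drains out of the market within a few simulated years, with nobody conspiring to make it happen - which is all "natural selection in action" means here.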

So the interesting question is not why the credit-rating agencies are dishonest and incompetent. That's obvious.

The interesting question is why Congress never fixed this back in the Enron days. Separating the raters from the institutions that pay them was proposed then, probably by Robert Reich but also by the usual suspects (Krugman, DeLong, etc.).

Oh wait, we know the answer to that one too. Congress is incented by reelection and retirement jobs, which are paid for by the ... the credit-rating agencies and securities companies ...

Hmm. So why does Congress get away with such incompetence and corruption ...

Oh wait, we know the answer to that one too.

The newspapers don't write about this and the talk shows and television news groups don't educate the public.

I'll leave the rest of the "Oh, waits" to the reader.

The problem is the American voter.

My answer to Jeff Atwood's question: Why Does Software Spoil?

Jeff Atwood writes Coding Horror, one of my favorite technical blogs. In two posts he asks familiar questions (remember, he's from the XP world): Coding Horror: Why Does Software Spoil? and "Are Features the Enemy".

I think the answer is pretty straightforward. The way we pay for software induces perverse incentives. Here's the comment I wrote to his blog:

It's the business model.

If we rented software then there'd be a steady revenue stream for developers, and a weaker incentive to pile on features.

Since we buy software (sort of), developers have perverse incentives. They have to "break" the prior version -- by not maintaining it and by introducing new file formats -- and they have to add features to hide the fact that the updates are driven by vandalism, not value.

Of course software rental completely screws customers unless the file/data formats are completely public and interoperability is assured. I'm sure customers will realize how important that is.

Thirty years from now.

Viruses and worms now help drive this model. Security issues make it much easier for companies that own both the OS and the software products to "break" older software, thereby forcing updates. Feature addition is simply a way to make customers accept the forced update. If Microsoft didn't add features, but forced Office updates by making old versions run less well, customers would scream.

These kinds of things don't really have to be planned. They're simply emergent. It's just the way natural selection operates on business.

Of course the only thing worse than the current business model is the combination of software leasing and proprietary data formats, including proprietary metadata models. (Ever try extracting and moving all your iPhoto data?)

Happily customers would never fall for that trick.

Oh, wait ...

Monday, October 22, 2007

Sibling Neandertal and the end of the modern human

I asked the other day "what was Homo sapiens doing for 115,000 years?"

I didn't expect John Hawks to answer me so quickly (emphases mine):
... we have undergone light-years of change since the last Neandertals lived. This is not a question of "modern human origins" anymore. We can now show that living people are much more different from early modern humans than any differences between Neandertals and other contemporary peoples. I think that "modern humans" is on its way to obsolescence. What matters is the pattern of change across all populations. Possibly that pattern was initiated by changes in one region but the subsequent changes were so vast that the beginning point hardly matters... [From John Hawks Anthropology Weblog : 2007 10]
Most of the popular books on human origins I've read emphasize how similar we are to Homo sapiens of 60,000 years ago. It appears that meme is on its last legs. We look a lot like our ancestors, but our minds are very different.

To answer my original question -- Homo sapiens spent the last 115,000 years turning into an animal that could write. It wasn't easy.

It's tempting to suggest we need to name a new human "species" that was launched 15,000 years ago, but these arbitrary demarcations are increasingly unconvincing, just as misleading as the concept of "the modern human".

The more we learn about the evolution of the human mind the more fluid it seems. Our modern world has many more niches for exotic minds than the ancient world; if we live long enough we'll fill those niches.

One day we might need interlocutors -- even between speakers of the same language.

The rest of Hawks' article is well worth a close read. He uses a recent discovery from the Neandertal genome to support a favorite thesis of his -- that we are part Neandertal.

The Bush/Cheney regime in a nutshell

To a first approximation, this summarizes the entire reign of the modern GOP...
Suicide Is Not Painless - Frank Rich - New York Times

...The inspector general also assured Congress that neither Donald Rumsfeld nor Paul Wolfowitz knew anything about the crimes. Senators on the Armed Services Committee were incredulous. John Warner, the Virginia Republican, could not believe that the Pentagon’s top two officials had no information about “the most significant defense procurement mismanagement in contemporary history.”

But the inspector general who vouched for their ignorance, Joseph Schmitz, was already heading for the exit when he delivered his redacted report. His new job would be as the chief operating officer of the Prince Group, Blackwater’s parent company....

Sunday, October 21, 2007

Hibbing and the Hull Rust

Hibbing is a northern MN town with a remarkable history (the 10/07 Wikipedia article doesn't do it justice).

If you're ever wandering Minnesota's Iron Range, I recommend a visit to the outskirts.

Yes, as seen in this screen capture of a Google satellite map, the town of Hibbing (bottom right of the image) is dwarfed by what was once the world's largest open pit mine - the Hull Rust:
".... more than three miles long, two miles wide and 535 feet deep. This man-made "Grand Canyon of the North" was the first strip mine on the Mesabi Iron Range. The amazing view continues to grow as the Hibbing Taconite Company Mine expands its mining operations.

Since 1895 more than 1.4 billion tons of earth have been removed on its 2,000 acres of land, and more than 800 million gross tons of iron ore have been shipped from the mine. At peak production in the 1940's, as much as one quarter of the ore mined in the United States came from the Hull Rust Mine."
Hibbing was a boom town before WW I, with 60 saloons, about 16,000 people, an Opera House and a gorgeous Carnegie Library. That's about the time folks realized the town was sitting on vast amounts of iron. The town was demolished; it's not clear it ever truly recovered. The Greyhound bus company started then, busing miners from their new, more distant homes.

There's a small park, little known to tourists, documenting a bit of the old town. The park itself is falling into history; it looks like a political gift that has since been neglected. There's a lot of that in the Range.

I took a hand-held panorama picture from the Hull Rust lookout and stitched it together. It was a hack job with lots of artifacts and a curious triplication in one join, but it does capture something of the view from the lookout (click to see a larger image; you can download the full res as well):

From The Iron Range
Update 10/27: Here are some more photos from our Range trip, see also: The Wellstone memorial and the path not taken.

I am 113810027503326386174. And 578762461. At least.

Today I have been re-christened 113810027503326386174. It is the ID Google assigned to the persona associated with Gordon's Notes and other blogs. I assume it will be the foundation for Google's future identity management services.

That persona also has my primary GMail account, though at last count I had at least four GMail personas (associated with various Google App domains -- nothing fishy about it).

I will need to add this new number to the page where I park all my public and related personas.

Incidentally, I read in Slashdot today that Facebook, where I am 578762461, is moving to 64-bit identifiers. That should cover the first ten minutes of the post-Singularity identity explosion.
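
For scale, here's a minimal Python check using the two IDs from this post - just arithmetic, no claim about how Google or Facebook actually store these things:

```python
# How big are these identifiers, really? (Arithmetic only.)
google_id = 113810027503326386174
facebook_id = 578762461

print(f"2**64 = {2**64:,}")  # 18,446,744,073,709,551,616
print(f"Google ID: {google_id.bit_length()} bits")      # 67 bits - already too big for 64
print(f"Facebook ID: {facebook_id.bit_length()} bits")  # 30 bits - plenty of room, for now
```

Note that my Google number already overflows a 64-bit integer; apparently Google planned past the first ten minutes.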

I thought of 113810027503326386174 and tattoos, but that's not something to joke about.

It will take me a while to memorize these new names. Mnemonic, anyone?

Update 10/2/08: It finally occurred to me that this is my "number of the beast". It is, after all, clear where Google is going, it's a number, and it's mine. So now I can add the religion tag to this post. As of today there are 0 hits on "Google Profile" and "Number of the Beast". Maybe I can tweak that a bit ...

Update 4/25/09: Well, that was quick.

http://www.google.com/s2/sharing/stuff?user=113810027503326386174 no longer works! Instead the new URL is www.google.com/s2/profiles/113810027503326386174 which now redirects to http://www.google.com/profiles/jfaughnan.

Update 9/19/11: My G+ posts incorporate this ID

Saturday, October 20, 2007

iPhone Jan 2008 and, at last, the end of Palm

Apple doesn't make the mistake of thinking the customer knows best. Mostly this works, but sometimes it leads to perverse obstinacy. iPhoto won't import image libraries. Aperture can't edit date metadata. Apple can be incredibly obtuse.

So it was plausible, though it bordered on the crazed, that Apple intended to own the iPhone completely, and to reserve all software production to Apple.

It felt even more plausible when Apple talked about AJAX apps as though they were a credible solution, tightened its ringtone control, and went to war with Apple geeks -- and when even Nokia taunted the lion.

Plausible enough that fear of Apple's choices meant I couldn't get an iPhone until Apple met my personal requirements. I've been sitting on my wallet, even contemplating a BlackBerry. Yes, even contemplating another year with a Motorola RAZR, a replacement Palm Tungsten, and an iPod.

Then, last week, Apple promised a true iPhone/iTouch SDK in February 2008 and Pogue wrote:
... Here's my view of the timeline: Leopard ships October 26th. Apple announces a new iPhone model or models at Macworld Expo on January 15th. The models ship along with an updated OS that's more fully Leopard for iPhone as a software update by early February. The iPhone SDK appears shortly thereafter...
It seems that Apple has chosen the iTunes signed application distribution model, with Apple taking a percentage of every sale. We don't know whether they'll go to a software subscription model.

I'm good with that. If they'd announced this a month ago I'd have an iPhone now.

So was this the plan all along, or did Jobs change his mind? I suspect it was more or less the plan, but if there'd been less screaming Jobs might have tried to own the whole thing.

So why the long delay between product release and SDK announcement? Probably the 10.5 delay, which arose at least partly from the decision to take the iPhone to market. Key people were pulled from 10.5 to the iPhone, then pulled from the iPhone back to 10.5. The last step meant there was no time to do an SDK and improve the iPhone APIs, and perhaps Jobs didn't want to announce SDK plans six months ahead of time.

They also needed to get everything working flawlessly with 10.5, which was hard to do before 10.5 went public.

We're close enough to Jan 15th that I'll wait to see if Pogue's right, but clearly this is great news for Apple geeks.

Oh, and what does this have to do with Palm?

As long as it seemed possible that Apple was going to keep the iPhone a strictly entertainment-oriented device, Palm still had a ray of hope. That light just went out. Assuming Apple doesn't screw up, and I don't think they will, a large developer community will fill any Palm functionality that Apple doesn't provide.

Palm, at last, is finished.

Update: Daring Fireball has a detailed discussion of a hypothetical but plausible signed distribution model and how it would work for developers.

Update 10/21/07: Glenn Fleishman of Tidbits covered this topic with more nuanced detail. Nokia's "Symbian Signed Application" program might be Apple's model. Glenn also mentions something I forgot, that Leopard's signed application model is probably a prerequisite for the iPhone's application distribution mechanism. No mention of what this has to do with current iPhone applications "running as root", but it sure feels like the iPhone was originally designed to run 10.5, and that the shipping version was put together in a hell of a rush.

It's a miracle the 1.0 iPhone works as well as it does. One day someone will write a heck of a geek book about the iPhone's creation. I'm imagining it involved flogging and illegal stimulants.

Update 10/21/07: Michael Tsai says we shouldn't be so trusting; the word "SDK" need not mean the ability to deliver the class of products Apple delivers. So if we don't hear more details with iPhone 2.0 in January, waiting for February would be a good idea.

Fallows says No to Mukasey

If Mukasey is rejected, Bush will use some in-place stooge to run the department through 2008. No matter, Fallows is right:
James Fallows (October 19, 2007) - Mukasey: No

....A specific point: the 'waterboarding' outrage. As is now becoming famous, Mukasey said this, when asked by Sen. Sheldon Whitehouse whether waterboarding was constitutional: “I don’t know what is involved in the technique,” Mr. Mukasey replied. “If waterboarding is torture, torture is not constitutional.”

Either way you slice it, this answer alone is grounds for rejecting Mukasey. If he really doesn't 'know what is involved' in the technique, he is unacceptably lazy or ill-informed. Any citizen can learn about this technique with a few minutes on the computer.* Any nominee for Attorney General in 2007 who has not taken the time to inform himself fits the pattern of ignorant incuriosity we can no longer afford at the highest levels.

The cult of IQ

It occurred to me that Greens and Grays and IQ is a good excuse to say something about the cult of IQ (that is, whatever IQ tests measure).

Very quickly (because the kids are getting restless):
  1. Whatever IQ score means, it's only loosely correlated with most measures of "success" - at least in our world. It certainly doesn't correlate with number of genetic descendants, but it doesn't correlate with leadership success or even wealth either. There are lots of poor and/or unhappy members of Mensa, and lots of very successful entrepreneurs with unremarkable IQ scores.
  2. Whatever value IQ might have in today's world, it will probably have about as much value in 30 years as muscle has had since the steam engine was fully implemented. The strong become weak, the weak become strong; it's the selfish justification for "compassion".
  3. There's not much evidence that IQ correlates with either insight or judgment. I suspect one day we'll figure out they have pretty different physiology, evolutionary history, and adaptive advantages. I don't think George Bush's problem is that he's dumb; his old SAT scores tell us that, at least as a teen, he had a quite decent IQ.
IQ is useful, but generally overrated.

Thursday, October 18, 2007

So what was Homo sapiens DOING for 115,000 years?

After humans were hunters, but before they were farmers, they learned to fish ...
ASU team detects earliest modern humans | ASU News

After decades of debate, paleoanthropologists now agree the genetic and fossil evidence suggests that the modern human species – Homo sapiens – evolved in Africa between 100,000 and 200,000 years ago...

... “Generally speaking, coastal areas were of no use to early humans – unless they knew how to use the sea as a food source” says Marean. “For millions of years, our earliest hunter-gatherer relatives only ate terrestrial plants and animals. Shellfish was one of the last additions to the human diet before domesticated plants and animals were introduced.”

Before, the earliest evidence for human use of marine resources and coastal habitats was dated about 125,000 years ago. “Our research shows that humans started doing this at least 40,000 years earlier. This could have very well been a response to the extreme environmental conditions they were experiencing,” he says.

“We also found what archaeologists call bladelets – little blades less than 10 millimeters in width, about the size of your little finger,” Marean says. “These could be attached to the end of a stick to form a point for a spear, or lined up like barbs on a dart – which shows they were already using complex compound tools. And, we found evidence that they were using pigments, especially red ochre, in ways that we believe were symbolic,” he describes.

Archaeologists view symbolic behavior as one of the clues that modern language may have been present. The earliest bladelet technology was previously dated to 70,000 years ago, near the end of the Middle Stone Age, and the modified pigments are the earliest securely dated and published evidence for pigment use.

“Coastlines generally make great migration routes,” Marean says. “Knowing how to exploit the sea for food meant these early humans could now use coastlines as productive home ranges and move long distances.”..
From the press release alone, it seems the significant observation is that early Homo sapiens may have evolved by the ocean. Hard to know if that explains why we are, for a primate, terrific swimmers [1]. The study also moves a key cognitive task, bladelet creation, back another 60,000 years.

So if humans could manufacture bladelets 125,000 years ago, what the heck were they doing for the 115,000 years prior to conquest of the planet? That's a heck of a long time in the context of human evolution.

We have a lot in common with those early Homo sapiens, but I suspect our minds are pretty different.

Update 10/20/07: I remembered this was called the "aquatic ape theory". It may have been popular in the 1970s. It's suffered from some eccentric proponents over the years.

Greens and Grays and IQ

James D. Watson appears to be a member of the Bell Curve club. He's also very old, and I suspect his own IQ is nowhere near where it once was.

Whatever the cause of Watson's opinion, the topic has led to the usual questions about the genetics of "whatever it is that IQ tests test". I read the NYT response as relatively cautious about the influence of post-natal environment on IQ. It could be read as acknowledging that IQ is largely determined by genes and the intrauterine environment, with very little other environmental influence. I think that is roughly the current scientific consensus.

I've written about this before; it's a fascinating if unsettling topic. Ashkenazi Jews and South Koreans seem to be unusually good at clever things, and for the former there are even some suggestive genes to inspect.

But what of it?

Let us assume the human race was divided into Greens and Grays, and that Greens scored 20 points higher on IQ tests than the Grays. This would translate into lots of Green wealth and power.

What would the Greens then owe the Grays? What do the strong owe the less strong? That, to me, is the more important question.

I, of course, am a good commie. From each according to their ability, to each according to their need. Adjusted for human limitations of course.

There are no American professional hockey players

Or so I might think, judging from the U.S. Hockey Hall of Fame Museum Inductees. I gather from the absence of inevitable inductees like Rocket Richard and Wayne Gretzky that only US-born players can join up. It's a paltry list, and I don't recognize any of them from the years I followed hockey.

I think they need to bend the rules a bit. Wayne married an American after all, and for all I know he's naturalized by now. Heck, what about Jacques Lemaire, now residing in my hometown? Surely Jacques must have a green card ...