Monday, January 03, 2011

America 2011: exploring the boundaries of democracy and capitalism

Reading about the fall of Detroit, the ranting of the GOP's Tea Party House, ubiquitous fraud and corruption, and the emergence of the corporate person, it's easy to imagine that we're going where no nation has gone before -- testing the limits of both democracy and capitalism.

But has any nation been here before? How long did Athenian democracy last? Not all that long actually ....

Athenian democracy - Wikipedia, the free encyclopedia

Athenian democracy developed in the Greek city-state of Athens, comprising the central city-state of Athens and the surrounding territory of Attica, around 508 BC. ..

... The greatest and longest lasting democratic leader was Pericles; after his death, Athenian democracy was twice briefly interrupted by oligarchic revolution towards the end of the Peloponnesian War ... It was modified somewhat after it was restored under Eucleides; the most detailed accounts are of this fourth-century modification rather than the Periclean system. It was suppressed by the Macedonians in 322 BC ...

The Peloponnesian war lasted from 431 BC to 404 BC. So Athenian democracy might have lasted as little as 100 years. If you count periods of restoration you might get to 160 years. American "Democracy" [1] is about 235 years old.

Going where no nation has gone before ...


[1] Assuming one starts with white male landowner voters.

Product suggestion: Integrated attic stair and stair frame seal

When I'm looking for a product, I first imagine how it would work. Then I search for it.

Often I find exactly what I imagined. Sometimes, however, it doesn't exist.

Today, in the third coming [1] of the Minnesota ice dam, I went looking for a replacement for our old attic stairs.

I imagine an integrated stair/insulation system. This would be a metal frame set into our attic floor. It would hold traditional folding/sliding stairs. In addition it would have a "lid" atop the frame with a latch and a door lifter.

To enter the attic one would go up the stairs and then open the latch. The lid, sealed by high quality rubber seals, would open up with a slight push. The door lifters would keep it open. A string would be used to shut it.

A simple idea, but this crude system is the best I could find - Attic Stair Insulation Insulsure Attic Tent AT-2 Model. So we'll have to have something custom made for us.

If you're a bored manufacturer, you should be able to put this together in a few months and sell it next year. Please send me a link and I'll update this post.

[1] Third in my residence here. Prior bad years were 1997 and 2004, but 2010-2011 may be the worst of all.

Sunday, January 02, 2011

Detroit in ruins - we need salvage law for this wrecked vessel

The Observer is highlighting Detroit in ruins, a series of photographs by Yves Marchand and Romain Meffre.

Think of it as a slow motion but more devastating version of Hurricane Katrina.

The most stunning image, for me, was the picture of the east side public library:

[screenshot: the east side public library]

They abandoned a building full of books.

Other pictures show unsalvaged Art Deco chandeliers, classrooms with anatomic models, and piles of documents in an abandoned police station. Some of them are doubtless staged (this is art, not documentary), but did they really stage the library?

Can we organize a team of crack commandos to liberate the old books?

At this point, maybe we should treat Detroit like a wrecked ship open for salvage. Ten percent of revenue on recovered goods goes to the city, the rest to the winner.


Resolution 2011: Managing complexity

I'm good with resolutions. Mostly because I know how to pick 'em. I make 'em doable.

Consider sleep. I like to exercise, but in my life sleep is more important. So I've resolved to sleep at least 52 hours a week [1]. I think I can do that if I track the numbers and identify where I fall short.
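The tracking itself is trivial. A minimal sketch in Python, with made-up nightly numbers -- a real log would live in a spreadsheet or an app:

```python
# Weekly sleep log: total the nightly hours and measure the shortfall
# against a 52 hour/week target.
WEEKLY_TARGET = 52.0

def sleep_deficit(nightly_hours):
    """Return (total hours slept, shortfall vs. the weekly target)."""
    total = sum(nightly_hours)
    return total, max(0.0, WEEKLY_TARGET - total)

week = [7.0, 6.5, 7.5, 6.0, 7.0, 8.5, 8.0]  # Mon..Sun, invented numbers
total, shortfall = sleep_deficit(week)
print(f"slept {total:.1f}h, short by {shortfall:.1f}h")  # slept 50.5h, short by 1.5h
```

Falling short by an hour and a half is exactly the sort of gap the tracking is supposed to expose.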

That's one for 2011. The other resolution is about managing technological complexity.

I've been on a complexity reduction kick for a few years, but this year my focus is on technological complexity. I'm starting with the plausible assumption that we all have a personal "complexity budget". Some of us can manage more complexity, others less, but we all have memory and processor limits -- even the AIs among us.

We can spend our complexity capacity figuring out how to adjust product development to available capacity, or we can spend it figuring out what parts of SharePoint are worth using [2]. Both tasks will be equally draining.

At some points in my life I had complexity capacity to spare - perhaps because I wasn't using it wisely. That's not true now.  Gains from improved productivity techniques [3] and growth of mind [4] are offset by entropic neurons. Most of all though, my life overflows. I'm not complaining -- it's an overflow of good stuff. It means though, that I need to use my complexity capacity wisely.  I can't be spending limited firepower figuring out which of my 15 Google identities is running a feedburner bot linked to a pseudonymous twitter account.

It's not easy to reduce technological complexity. It often means making do with less; giving up tools and solutions I really like. Often it means declining new, incrementally better tools -- because a 10% gain isn't worth the learning curve and failure risk. Sometimes it means giving up on old tools that still work but are increasingly unsupported. Yes, it's a lot like software portfolio management.

Looking at how technological complexity grows in my life I can identify four broad causes....

  1. Taking on too many simultaneous tools and solution sets.
  2. Failure to clean up. Ex: Abandoned user identities, google accounts, etc. Creates noise and clutter.
  3. Premature adoption of technologies and solutions. Ex: Any new OS X release, any new Apple hardware, trying to get Contact synchronization to work across Google Contacts, the OS X Address Book and the iPhone, OS X Spaces. Above all - Google Wave.
  4. Prolonged use of increasingly unsupported solutions in a world of forced software evolution [6]. Ex: document-centric web tools, wristwatches, printers.

I've gotten better at the first one, but the next three all need work. The third and fourth are particularly tricky. My heavy use of Google's multi-calendar sync solutions is clearly premature [5], but it's been very valuable and relatively bug free. On the other hand, I think my jury-rigged Contact integration solution may be a bridge too far. Then again, I stuck with Outlook's Notes feature long after it was clearly dead.

Cleaning up is the least interesting measure, but one of the most important. There are 1,575 entries in my credentials database, extending back to August 1995. Sure, most of those sites are long gone, but I still have too many active identities and credentials. I need to gradually cull a few hundred.
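The cull could be mechanical once the data is exported. A sketch, assuming a hypothetical export of (site, year last used) pairs -- my actual credentials database doesn't look like this:

```python
# Flag credentials untouched for five or more years as cull candidates.
def cull_candidates(credentials, current_year, stale_after=5):
    """credentials: iterable of (site, year_last_used) pairs."""
    return [site for site, year in credentials
            if current_year - year >= stale_after]

creds = [("geocities.example", 1998),
         ("mybank.example", 2010),
         ("deadforum.example", 2003)]
print(cull_candidates(creds, current_year=2011))
# ['geocities.example', 'deadforum.example']
```

The hard part, of course, isn't the filter -- it's deciding which "dead" identities still have a live account attached.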

This project should keep me busy for a while. It will, of course, suck processors in the short term, but I expect near term returns and long term gains. Feels like a good resolution target.

-- fn --

[1] It helps that recent research suggests that amyloid clearance occurs primarily during sleep, and I'm speculating that a 10-20% decline in amyloid clearance translates to 10 extra years of dementia.
[2] The wiki and, in the absence of alternatives, the document store. Don't touch the rest, even the stuff that seems to work is poison. 
[3] At my stage "GTD" is child's play.  I use a mongrel of Agile development planning methodology, GTD/Franklin, and a pocketful of tricks including calendar integration across family and work.
[4] For quite some time mind can grow even as brain more or less sucks wind. Not forever, but for a time. 
[5] The UI for configuring multiple calendars has been bizarrely obscure for about two years. This is not mainstream.
[6] It's predator-prey stuff. Software evolution was much more leisurely before human-on-human predation took off with hacks, frauds, identity theft, malware and the like. Now old bucks have to keep moving, or we become wolf chow. Software cycles are faster, products die quickly, and we have to keep buying whizzier hardware. If not for malware, the curated world of iOS would still be years away.


Saturday, January 01, 2011

Why the United States Postal Service should manage our primary digital identity

For a non-expert, I do a fair bit of ruminating about the relationships between identities, credentials, and avatars/facets. Today a bug related to Google's (covert) Identity Integration initiatives, a recent flurry of stories on the end times of password-based security, and the earth's orbit have got me chewing again.

I'll deal with the earth's orbit by making my solitary 2011 tech prediction. 2011 will be the year of two-factor authentication and the gradual realization that management of digital identities is too important to be left to Google, Amazon and especially Citicorp, Facebook, and AT&T/Verizon.

So if we can't rely on Google (or Facebook) or Citicorp to manage our digital identity, including claim resolution and identity control, who can we rely on? What are the other alternatives, assuming that almost none of us will run an identity service out of our homes?

Obviously, government is an option. The (US) Federal government, for example, makes a robust claim on my identity. That claim, however, is so robust I would prefer to separate my obligatory IRS identities from all other identity related services. In any event direct US government identity management is a political non-starter. The right wing will start ranting about beastly numbers and rationalists will fret about the day Bush/Cheney II takes power.

That leaves business entities with strong governmental relationships, extensive regulation, and a pre-existing legal framework that could be extended to support identity management.

An entity like, for example, the United States Postal Service (USPS).

You laugh. Ok, but consider the advantages:

  1. The USPS has been in the business of managing confidential transactions for centuries.
  2. There are post offices in every community that could support the person-present aspects of identity claims.
  3. It's a regulated quasi-governmental agency that already exists.
  4. The USPS manages passports.
  5. Much of the legal framework used to manage mail and address information could be extended to manage digital identities.
  6. The USPS is dying and is desperate for a new mission.

I admit, it sounds crazy.

Except ... I'm far from the first person to think of this. It was proposed by (cough, choke, gag) Michael Chertoff ...

... former Department of Homeland Security Secretary Michael Chertoff ... mused that the USPS was ideally situated to take part in the evolution of the government’s role in validating identity. He points out that the Post office is already the primary issuer of passports – an extremely important piece of personal identity. In the speech he expands on that model as follows: “one of the things I hope to see is, as the Post Office re-engineers itself over the next, you know, few years, they increasingly look at whether they can be in the business of servicing identity management. They can – because every town has a post office.”....  DHS: Remarks by Homeland Security Secretary Michael Chertoff at University of Southern California National Center for Risk and Economic Analysis of Terrorism Events

I can't believe I find myself agreeing with Chertoff, but there you go. What a way to start 2011.


[1] Incidentally, now that my kateva.org Google Apps users have Blogger privileges, and since Blogger is supposedly an OpenID provider, I'm thinking of implementing this using Blogger/Google Apps/Kateva.org

Update 1/8/11: A few days after I wrote this news emerged of a federal identity and certificate management initiative. Maybe I'm psychic.

Thursday, December 30, 2010

iPhone apps and the Philip Morris business model

Once upon a time tobacco companies distributed abundant free samples. It's a good business model when you're selling an addictive product.

This holiday the kids have been piling up iOS games on their phones, mostly at $1 each. Since FairPlay DRM works on an iTunes/account basis, each app goes to four (soon five) iOS devices.

At 20 cents/app/phone this looks like a heck of a bargain -- but appearances are misleading. Increasingly the apps are entry points to a series of in-app purchases [1]. The real price is a cumulative sum; I'm guessing the average cost is more like $5 than $1. iOS games are rediscovering the Philip Morris business model.
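The arithmetic is worth spelling out. A sketch with invented in-app purchase prices; the five-device divisor matches our family account:

```python
# Effective cost per device: the sticker price amortizes across the
# family's iOS devices, but in-app purchases pile onto the same total.
def cost_per_device(sticker, in_app_purchases, devices=5):
    return (sticker + sum(in_app_purchases)) / devices

print(cost_per_device(1.00, []))  # 0.2 -- the 20 cent "bargain"
print(round(cost_per_device(1.00, [0.99, 1.99, 0.99]), 3))  # 0.994 -- five times the sticker
```

A handful of modest in-app purchases quintuples the per-device price, which is the whole point of the free-sample model.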

This isn't necessarily a bad thing - for games. It means we can try quite a few games, and only spend money on those we really like.

It's certainly not a bad thing for experimental economists! The App Store is a wonderful model for exploring pricing strategies ...

[1] I don't know how FairPlay treats these. Are they tied to a particular device or to the account? I suspect the former, which means the FairPlay redistribution (our five devices) becomes a feature, not a problem.

Tuesday, December 28, 2010

A quiet revolution in naming and framing disorders of the mind

Victory: The war against 20th century psychiatric diagnoses is all but won. It's been a long time coming; this rebellion has roots going back to the 1970s (not all of them equally evidence based). Things have really picked up over the past decade.

Our misclassification of disorders of the mind has led psychiatry, and neuropsychiatry, into a frustrating blind alley. Now that we're realizing our mistakes, we can start to make real progress.

There are significant implications for our understanding of disability in a technocentric society, and for our glacial rethinking of the meaning of responsibility. Those memes are still baking ...

Monday, December 27, 2010

The history of post-neolithic humanity in 10 minutes - DeLong's annual Econ 1 post

Brad DeLong, my favorite economist, has published the latest edition of his annual Econ 1 Berkeley: September 29 2010 Economic Growth Lecture. It's his gift to the rest of us, and a fine gift it is [1]. This is why I love blogs.

In about ten minutes anyone can catch up on the most current synthesis of the past 12,000 years of human history, from the deep history of the Neolithic to modern IT and the rise of India and China. He stops just short of putting IT on the same level as the development of language -- too soon to tell.

That leaves unspoken the period from about 150,000 BCE to 10,000 BCE and especially 30,000 to 12,000 BCE. This is deep history, and 2010 has been a breathtaking exploration of the pre-neolithic. In just the last eight months we've learned we moderns are a mongrel mix of Denisovan, Neandertal and probably a lot of other pre-neolithic human "breeds". Out of that churning mix came something astonishing, horrifying, and (we currently believe) completely new to the earth - the technocentric animal.

Exciting times.


[1] The next time I'm out SF way, I'm going to see if there's some way to sneak into a DeLong lecture. Maybe he sells tickets?

Saturday, December 25, 2010

The Chinese net and machine translation

Chinese, for a time, will pass English as a net language. The authors imply that's a predictable course, but they forget that the world's largest English-speaking nation is India. So things may go back and forth for a while.

Even so, this would be a good time to make English-Chinese machine translation actually work.

Let me say that again with a bit more emphasis.

Working, bidirectional, English-Chinese machine translation may be the single most important technological goal of this decade.

I'll leave it to the reader to imagine why it will be so important. If you think about it for a few minutes, you should be able to come up with a good list.

Is this an achievable goal? I'm not sure. On the one hand we already have reasonable translation between closely related European languages. On the other, Google's current English-Chinese translation is worthless. The only time I've seen it work was when the Chinese article was a translation of an interview conducted with an English speaker. I know very little about the field, but I wonder if Google's statistical approach has run into a brick wall. Effective English-Chinese machine translation may require other approaches.

I'm not sure, but I would bet we'll see it work within ten years. As we get closer, I wonder if we'll start to see development of writing styles that are easier to translate. Any (typically unilingual) English speaker who routinely works with non-English speakers learns to speak in a form that's easier to translate. Sentences are shorter. Syntax is simpler, but vocabulary is more precise and often more technical. There are fewer short words with multiple meanings, and more polysyllabic words with single interpretations. Depending on the non-English speaker's language, certain phonemes are avoided. Compositional words, made up of reusable terms, may work better than novel strings.

The resulting form is certainly English, but it is a technical and streamlined form of English.

Obviously, there are equivalent versions of written and spoken Chinese.

I suspect that as English-Chinese machine translation starts to become useful, these modified forms of written expression will play an important role.

Good luck with this one Google. Get it right!

Thursday, December 23, 2010

Digital cameras 2011-2017

David Pogue writes today about affordable, bigger-than-pocketable cameras with reasonable ISO 800 images. That made me think about how the digital camera market breaks down at the end of 2010. There are about 5 markets left today; cameras like the Canon G series are being replaced by these newer models.

  1. iPhone 4 and equivalent
  2. Very compact simple cameras. (Canon, everyone else)
  3. Non-pocketable smaller-than-SLR fixed lens cameras (Canon, Panasonic, Samsung)
  4. SLRs (Canon and Nikon)
  5. MILC - Mirrorless interchangeable lens cameras (everyone but Canon and Nikon)

The second category is dead. Why buy an ultra-compact when you own an iPhone 4 or the equivalent?

So that leaves 4 "camera" categories for 2011-2012. The middle two occupy the same two niches that have existed since the 1960s ...

  1. iPhone 4 and equivalent.
  2. Non-pocketable smaller-than-SLR fixed lens cameras -> same niche as the film rangefinder
  3. SLRs (Canon and Nikon) -> same as film SLRs
  4. MILC

The last category is the interesting one. We've been expecting the MILC for the past decade, so it's hardly a surprise. Manufacturers are now ready to chop the prism and the mirror. By 2012 Nikon and Canon will have MILCs that work with their current lenses, and low end SLRs will fade away.

More than the SLR is at risk; MILCs can be much smaller than an SLR. That doesn't leave much room for the "rangefinder". So by 2017 we'll have only two categories:

  1. iPhone 8 and equivalent
  2. MILC (Canon and Nikon)

Next year I expect to replace my 4-5 yo Digital Rebel XT with a 2011 model that bloody well better shoot ISO 1200 as well as my current camera shoots ISO 400 [1]. That will almost certainly be the last SLR I'll buy.

[1] More pixels means I can use less optical zoom, so larger F-stop, so the overall light sensitivity per end-image pixel is greater than the ISO difference alone. Needless to say, I'm not impressed with megapixels. I want photons.

Monday, December 20, 2010

Which animal throws best? It's a mystery.

Which animal is biggest? Smallest? Fastest?

Good questions, and there are lots of answers. It's the sort of thing we talk about over breakfast. This morning, feeling clever, I asked my kids which animal throws best.

It's an important question.  A large animal that can throw a hefty object has a terrific attack and defense advantage against prey and predator alike. A social animal capable of throwing a hefty rock at 80 kph could, you know, rule the world.

There are very few animals that can throw well. It's quite a trick. To do it well an animal has to be bipedal, it needs really good binocular vision, and it has to be able to calculate trajectories.

Gorillas can throw things, but I don't think they compare to the naked ape.

Of course the kids didn't trust my answer. They wanted confirmation.

That's when I uncovered the mystery. The mystery of this search ...

"which animal throws best" - Google Search: "No results found for 'which animal throws best'."

No freakin' results?! Google is telling me nobody has asked this question with this phrasing on the net?!

That's not possible.

Is it?

Sunday, December 19, 2010

Gordon's scale of corporate evil - 2nd edition

A post on why Facebook, Netflix and Amazon will join the Google-Apple truce reminded me it's time to update Gordon's 15 point scale of publicly traded corporate evil.

Here's the current list, with notes on who's moved up and down ...

  1. Philip Morris: 15 (defines the upper bound)
  2. Goldman Sachs: 14 (up - working for SPECTRE).
  3. Exxon: 13 (global warming advocacy)
  4. AT&T, Verizon: 12 - up two
  5. Facebook: 11 (down 1 - still most evil tech company)
  6. For profit health insurance companies: 11
  7. Microsoft: 10
  8. Average publicly traded company: 8
  9. Google: 5 (down one)
  10. Apple: 5
  11. CARE International: 1 (They're not a PTC, so this is merely a non-evil reference point)

The greater enemy: why Facebook, Netflix and Amazon will join the Google-Apple truce

The Google-Apple war officially ended in September. I've not seen any convincing explanation of why the war ended; my best guess is that both companies realized that Verizon, AT&T and Comcast are the greater enemy. The three big carriers want to bring the cable TV business model to the net through the IP Multimedia Subsystem ... (emphases mine)

Mobile Carriers Dream of Charging per Page | Epicenter | Wired.com:

... The companies, Allot Communications and Openet — suppliers to large wireless companies including AT&T and Verizon — showed off a new product in a web seminar Tuesday, which included a PowerPoint presentation (1.5-MB .pdf) that was sent to Wired by a trusted source.

The idea? Make it possible for your wireless provider to monitor everything you do online and charge you extra for using Facebook, Skype or Netflix. For instance, in the seventh slide of the above PowerPoint, a Vodafone user would be charged two cents per MB for using Facebook, three euros a month to use Skype and $0.50 monthly for a speed-limited version of YouTube. But traffic to Vodafone’s services would be free, allowing the mobile carrier to create video services that could undercut NetFlix on price....

... “It certainly is exactly the thing we have been warning the companies will do if they have the opportunity and explains why AT&T and Verizon are so insistent that the wireless rules be solely about blocking and not anything else,” said Public Knowledge legal director Harold Feld....

... The ideas don’t look too different from the way cable companies price their video offerings, with different packages of programming at different levels.

... “I have been saying that this is where they want to go for a while,” van Schewick wrote to Wired. “The IP Multimedia Subsystem (IMS), a technology that is being deployed in many wireline and wireless networks throughout the country, explicitly envisages this sort of pricing as one of the pricing schemes supported by IMS.”...

... And as van Schewick points out, this model is already showing up in European mobile networks, where some networks charge users an extra fee to use internet telephony or to use an e-mail client on their phone....

... For instance, Comcast runs an online video service called FanCast that competes with NetFlix and YouTube, and is trying to buy NBC, which owns more than 30 percent of Hulu.com. And every cable and satellite company offers pay-movie services for an extra monthly fee and a la carte video on demand that compete with third-party streaming video services, like Blockbuster and Amazon....

I love the Orwellian twist of calling a cable-company business model venture "Openet".

Google and Apple will never be best buds again, but the vision of a net run like cable TV has concentrated their minds. At the moment then, though betrayals are certain, we have Google, Apple, Netflix, Amazon and even Facebook on one side. On the other side we have Verizon, AT&T and Comcast. Microsoft, the wounded Titan, lurks in the background, perhaps contemplating an acquisition.

As a consumer and citizen, there's no doubt which side I support. On a scale of corporate evil, AT&T & Verizon are far above Google and Apple (Facebook is another matter). Politically Google and Apple are pretty much on the Dem side, and Verizon, AT&T and Comcast are very much GOP.

Should be interesting, and scary.

Saturday, December 18, 2010

Hans Rosling's TED Talk Trendalyzer is now Google Docs Motion Chart

It's not new, but it's still awesome to see Hans Rosling demo the Trendalyzer wealth/mortality graph. It's a great antidote to Rationalist fatigue; we have made awesome progress in 200 years. Measured in wealth and life expectancy, the hellholes of the modern world are still better than most of the Europe of 1800.

Check out the mortality spikes at WW I/Spanish Flu and WW II, and the fall of South Africa.

The real trick is getting the data, but the software is still cool enough. Must be awesome to have something like that. Must cost a fortune though.

Except, it's free. My friend Rob M. pointed me to Motion Chart, Google's rebranding of the Trendalyzer. It comes with Google Docs. You can make some pretty nice graphics for a blog post from it.

Yes, we've come a long way.

Research the past 200 years of memetic propagation: Google Books ngram viewer

Google Ngram Viewer. Awesome.

The frequency of the word "dementia" spiked in the period 1870-1885. It then rose gradually to 1920 and fell until 1962 or so. Since then it has skyrocketed.

"Hepatitis" was used (Upper case "H", searches are case sensitive at the moment) around 1818, but then it disappeared until 1940. I wonder what it meant in 1818; it wouldn't be the modern meaning of the word.

"Schizophrenia" does not appear at all before 1910.

[screenshot: Ngram chart]

"Slide rule" starts rising in 1880, spikes in 1940, and then falls smoothly to a plateau around 1990.

[screenshot: Ngram chart]

The usage pattern of the "n-word" is remarkable -- there are peaks in 1860 (the war to preserve slavery), around 1940 (black Americans in the armed forces), 1970 (civil rights), and the late 90s (the meaning and usage of the n-word changes?).

Imagine how a teacher could use this.