Thursday, July 07, 2011

Will the noose close on Rupert Murdoch?

BBC News - News of the World to close amid hacking scandal. Wow.

The people who were hacking into the mobile phones of crime victims worked closely with some of Rupert Murdoch's longtime executives. Some of these executives are closely connected to Cameron, current PM of the UK. Some, no doubt, have connections to figures in right wing US politics. They created the culture that made these crimes praiseworthy. There must be more skeletons.

It will be interesting to see how Murdoch's US tabloid, the "Wall Street Journal," covers this news, and how far their journalists will be allowed to dig. Not as far, I expect, as the NYT's journalists.

Murdoch is going to burn everything he can to keep this at bay. Will his former henchmen squeal?

Update 7/8/11: Sounds like quite a few UK politicians have been hoping for Murdoch to falter. Blood and Treasure is on this story - recommended.

Monday, July 04, 2011

The sorry state of 2011 video editing

Even I have to admit some things have gotten better over the past decade. Digital cameras are one. Aperture 3 is another (but iPhoto '11 is a regression).

Video editing though -- it really sucks. Honest - it's awful.

Try searching on "archival video formats". I'll wait ...

Right. There is no agreement. (This discussion is the best I found via Google; I wrote this one in 2008.) Photographers justly consider JPEG and TIFF suboptimal archival formats -- but we're light years ahead of videographers.

Next, using iMovie '08, try to create a decent-looking .mp4 movie via Export using QuickTime. Take your time, I'll wait.

This has not gone well. I suspect the root cause is the format and patent wars that infest video technology. I am certain this is not the only domain where America's insane software patents are damaging growth and progress.
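
To make the patent point concrete, here's what the scriptable route looks like -- a minimal sketch that shells out to the open-source ffmpeg tool (my example, and my assumption that ffmpeg with libx264 is installed; this is not anything iMovie offers). Every codec it names is patent-encumbered, which is rather the point:

    # Sketch: transcode a master clip to an H.264/AAC .mp4 via ffmpeg.
    # Assumes Python 3 and ffmpeg (built with libx264) on the PATH.
    import subprocess

    def export_mp4(src, dst, crf=18):
        """Lower crf means higher quality; 18 is near visually lossless."""
        subprocess.run(
            ["ffmpeg", "-i", src, "-c:v", "libx264", "-crf", str(crf),
             "-c:a", "aac", "-b:a", "192k", dst],
            check=True)

    export_mp4("master.mov", "archive.mp4")

One function, no GUI -- and still no agreed archival format to point it at.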

See also

Update 7/5/11: The more I look into this, the worse $30 iMovie looks. Paradoxically, the more interesting $300 FCP X becomes.

America and the social safety net - what happens if future growth fails?

My understanding of the financing of Social Security, and perhaps of Medicare, was that we took some of the wealth of the future to make the present better.

This can be a reasonable trade. America of 2030 ought to be much wealthier than America of 2011. Why not share the wealth -- especially as we are borrowing from our future selves just as we gave to our parents.

But what if America stops getting wealthier? Or what if that wealth is concentrated in a small slice of the population, a disproportionately powerful segment that is disinclined to share its wealth -- and has the power to say no?

Then we have a deep problem with the way we have historically financed our social insurance.

If technological innovation really has slowed ...

The iPhone calendar.app color assignment debacle makes Android look good

There are web sites with reams of news about iOS 5 features.

I'd trade them all for a fix for the iPhone/iPad Calendar.app color assignment problem:

[screenshot: iPhone Calendars list]

Of the 10 calendars currently in my iPhone subscription list (9 Google ActiveSync Calendars, 1 corporate ActiveSync), 6 have been assigned a calendar color of "brown/beige".

iOS doesn't give users control over calendar color assignment, and the algorithm it uses to assign colors is broken even within a single server source. I think it once worked better, but even then color distribution was within a server, not across servers.
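
The odd thing is that the fix looks algorithmically trivial. A hypothetical sketch -- my guess at sane behavior, emphatically not Apple's actual code -- that deals distinct colors out in rotation across all sources at once:

    # Hypothetical color assignment: deal colors round-robin across ALL
    # server sources, so repeats start only after the palette runs out.
    from itertools import cycle

    PALETTE = ["red", "orange", "yellow", "green", "blue", "purple", "brown"]

    def assign_colors(calendars):
        """calendars: ordered list of (server, calendar_name) pairs."""
        colors = cycle(PALETTE)
        return {cal: next(colors) for cal in calendars}

    cals = [("google", "cal%d" % i) for i in range(9)] + [("corp", "work")]
    # 10 calendars, 7 colors: at worst a color repeats twice, not six times.
    print(assign_colors(cals))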

So is this fixed with iOS 5?

It seems not. With iOS 5, calendar color assignment works with iCloud/iCal, but there's no change for ActiveSync users. It's a sign that the Apple-Google war never really ended; it just became a grudging, surly détente.

The costs of switching my family from iOS to Android are extremely high. It would take a lot to drive me down that road. Every time I look at my calendar, however ...

Life with Google Two Step Verification - Sign-in Failed with Places.app

Places.app is one of Google's newer iPhone "social" apps. This is what you see if you try to sign in with a Google 2-step verification (two factor) account:

Sigh. It's been 3 months now since I implemented Google's "2-step verification" (technically, "two-channel" verification), and while I still rely on it, the process has been painful.
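
(The six-digit codes themselves are nothing exotic, by the way. Google Authenticator implements the standard time-based one-time password scheme, RFC 6238. A minimal sketch, assuming Python 3; the base32 secret below is a stock documentation example, not a real account:)

    # RFC 6238 time-based one-time password (HMAC-SHA1) -- the scheme
    # behind those six-digit verification codes.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32, period=30, digits=6):
        key = base64.b32decode(secret_b32.upper())
        counter = struct.pack(">Q", int(time.time()) // period)
        mac = hmac.new(key, counter, hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # demo secret, not a real one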

I've had to create so many "app-specific" passwords that I've taken to reusing them. In truth they're not app-specific at all, so now I have about 20-30 "extra" passwords for my one Google account.

Google started out reasonably well on this "beta" effort, but they haven't progressed. Now, with their focus on Google Plus, I'm afraid they're stuck.

At this point, 2-step verification is only for the hardiest of geeks.

See also:

Sunday, July 03, 2011

Why I've dropped Scientific American's news feed

Scientific American has run a pretty aggressive paywall for years. Even so, the SciAm news feed was readable.

"Was" being the operative word. Lately, too many of the posts are incomplete excerpts from articles behind their paywall.

Today they pushed me over the edge. They're gone.

Greed has its risks.

Saturday, July 02, 2011

NYT's 1982 article on how teletext would transform America

(with thanks to Joseph P for the cite).

There were familiar computing names in the 1980s - Apple, IBM and so on. There were also many now lost, such as the Atari and Commodore PCs. There were networks and email and decades-old sophisticated collaboration technologies now almost lost to memory.

Against that background the Institute for the Future tried to predict the IT landscape of 1998. They were looking 16 years ahead.

You can see how well they did. For reasons I'll explain, the italicized text marks word substitutions. Emphases mine ...

STUDY SAYS TECHNOLOGY COULD TRANSFORM SOCIETY (June 13, 1982)

WASHINGTON, June 13— A report ... made public today speculates that by the end of this century electronic information technology will have transformed American home, business, manufacturing, school, family and political life.

The report suggests that one-way and two-way home information systems ... will penetrate deeply into daily life, with an effect on society as profound as those of the automobile and commercial television earlier in this century.

It conjured a vision, at once appealing and threatening, of a style of life defined and controlled by network terminals throughout the house.

As a consequence, the report envisioned this kind of American home by the year 1998: ''Family life is not limited to meals, weekend outings, and once-a-year vacations. Instead of being the glue that holds things together so that family members can do all those other things they're expected to do - like work, school, and community gatherings - the family is the unit that does those other things, and the home is the place where they get done. Like the term 'cottage industry,' this view might seem to reflect a previous era when family trades were passed down from generation to generation, and children apprenticed to their parents. In the 'electronic cottage,' however, one electronic 'tool kit' can support many information production trades.''...

... The report warned that the new technology would raise difficult issues of privacy and control that will have to be addressed soon to ''maximize its benefits and minimize its threats to society.''

The study ... was an attempt at the risky business of ''technology assessment,'' peering into the future of an electronic world.

The study focused on the emerging videotex industry, formed by the marriage of two older technologies, communications and computing. It estimated that 40 percent of American households will have internet service by the end of the century. By comparison, it took television 16 years to penetrate 90 percent of households from the time commercial service was begun.

The ''key driving force'' controlling the speed of computer communications penetration, the report said, is the extent to which advertisers can be persuaded to use it, reducing the cost of the service to subscribers.

''Networked systems create opportunities for individuals to exercise much greater choice over the information available to them,'' the researchers wrote. ''Individuals may be able to use network systems to create their own newspapers, design their own curricula, compile their own consumer guides.

''On the other hand, because of the complexity and sophistication of these systems, they create new dangers of manipulation or social engineering, either for political or economic gain. Similarly, at the same time that these systems will bring a greatly increased flow of information and services into the home, they will also carry a stream of information out of the home about the preferences and behavior of its occupants.''

Social Side Effects

The report stressed what it called ''transformative effects'' of the new technology, the largely unintended and unanticipated social side effects. ''Television, for example, was developed to provide entertainment for mass audiences but the extent of its social and psychological side effects on children and adults was never planned for,'' the report said. ''The mass-produced automobile has impacted on city design, allocation of recreation time, environmental policy, and the design of hospital emergency room facilities.''

Such effects, it added, were likely to become apparent in home and family life, in the consumer marketplace, in the business office and in politics.

Widespread penetration of the technology, it said, would mean, among other things, these developments:

- The home will double as a place of employment, with men and women conducting much of their work at the computer terminal. This will affect both the architecture and location of the home. It will also blur the distinction between places of residence and places of business, with uncertain effects on zoning, travel patterns and neighborhoods.

- Home-based shopping will permit consumers to control manufacturing directly, ordering exactly what they need for ''production on demand.''

- There will be a shift away from conventional workplace and school socialization. Friends, peer groups and alliances will be determined electronically, creating classes of people based on interests and skills rather than age and social class.

- A new profession of information ''brokers'' and ''managers'' will emerge, serving as ''gatekeepers,'' monitoring politicians and corporations and selectively releasing information to interested parties.

- The ''extended family'' might be recreated if the elderly can support themselves through electronic homework, making them more desirable to have around.

... The blurring of lines between home and work, the report stated, will raise difficult issues, such as working hours. The new technology, it suggested, may force the development of a new kind of business leader. ''Managing the complicated communication in networks between office and home may require very different styles than current managers exhibit,'' the report concluded.

The study also predicted a much greater diversity in the American political power structure. ''Electronic networks might mean the end of the two party system, as networks of voters band together to support a variety of slates - maybe hundreds of them,'' it said.

Now read this article on using software bots (not robots, contrary to the title) to shape and control social networks and opinions, and then two recent posts of mine on the state of blogging.

So, did the Institute for the Future get it right - or not?

I would say they did quite well, though they are more right about 2011 than about 1998. I didn't think so at first, because they used words like "videotext" and "teletext". Those words sound silly because we still do very little with telepresence or videoconferencing -- contrary to the expectations of the last seventy years.

On careful reading though, it was clear that what they called "teletext and videotext" was approximately "email and rich media communications". So I substituted the words "computer", "internet" and "networked systems" where appropriate. Otherwise I just bolded a few key phrases.
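
Mechanically it was just a vocabulary swap over the quoted text. A sketch of the idea -- the mapping below is illustrative, since I actually made the substitutions by hand and by judgment:

    # Illustrative vocabulary swap; the real edits were made by hand.
    import re

    SWAPS = {"videotex": "internet", "teletext": "computer",
             "home information systems": "networked systems"}

    def modernize(text):
        pattern = re.compile("|".join(map(re.escape, SWAPS)), re.IGNORECASE)
        return pattern.sub(lambda m: SWAPS[m.group(0).lower()], text)

    print(modernize("The videotex industry will grow."))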

Rereading it now, they got quite a bit right. They weren't even that far off on home penetration. They also got quite a bit wrong. The impact on politics seems to have contributed to polarization rather than diversity. Even now few elders use computer systems to interact with their grandchildren, and none did in 1998.

So, overall, they were maybe 65% right, but about 10 years premature (on a 16-year timeline!). That's not awful for predicting the near future, but they'd have done even better to follow Charlie Stross's prediction rules ...

The near-future is comprised of three parts: 90% of it is just like the present, 9% is new but foreseeable developments and innovations, and 1% is utterly bizarre and unexpected.

(Oh, and we're living in 2001's near future, just like 2001 was the near future of 1991. It's a recursive function, in other words.)

However, sometimes bits of the present go away. Ask yourself when you last used a slide rule — or a pocket calculator, as opposed to the calculator app on your phone or laptop, let alone trig tables. That's a technological example. Cultural aspects die off over time, as well. And I'm currently pondering what it is that people aren't afraid of any more. Like witchcraft, or imminent thermonuclear annihilation....

Why Apple's thunderbolt cables have inline computers ...

Coverage of Apple's $50 Thunderbolt "active cable" focuses on performance advantages ...

The technology inside Apple's $50 Thunderbolt cable

.... Apple didn't respond to our requests for further information about the "firmware in the cable," but an EETimes article from earlier this year noted that in addition to having different electrical characteristics from Mini DisplayPort, Thunderbolt also uses active cabling to achieve full duplex 10Gbps transmission...

Maybe. I'm skeptical though. I suspect it's all about the DRM.

The state of blogging - dead or alive?

Today one of the quality bloggers I read declared blogging is dying. Two weeks ago, Brent Simmons, an early sub/pub (RSS, Atom) adopter, tackled the "RSS is dead" meme. Today I discovered Google Plus Circles don't have readable feeds.

Perhaps worst of all, Google Reader, one of Google's best apps, is getting no Plus love at all -- and nobody seems upset. The only reference I could find shows up in an Anil Dash post...

The Sparks feature, like a topic-based feed reader for keyword search results, is the least developed part of the site so far. Google Reader is so good, this can't possibly stay so bad for too long ...

That's a lot of crepe. It's not new however. I've been reading about the death of blogging for at least five years.

Against that, I was so impressed with a recent blog post that yesterday I raved about the terrific quality of the blogs I read.

So what's going on? I think Brent Simmons has the best state-of-the-art review. I say that because, of course, he lines up pretty well with my own opinions. (Brent has a bit more credibility, I admit.)

This is what I think is happening ...

  • We all hate the word Blog. Geeks should not name things.
  • The people I read are compulsive communicators. Brad, Charlie, Felix, Paul and many less famous names. They can't stop. Krugman is the most influential columnist in the US, but he's not paid for his non-stop NYT blog. Even when he declares he'll be absolutely offline he still posts.
  • Subscription and notification is absolutely not going away. Whether it's "RSS" (which is now a label for a variety of subscription technology standards) or Facebook's internal proprietary system, there will be a form of sub/pub/notify. There are lots of interesting sub/notification projects starting up. (A minimal polling sketch follows this list.)
  • Nobody has been able to monetize the RSS/Atom/Feed infrastructure. Partial posts that redirect to ad-laden sites rarely work. (A few have figured out how to do this, but it's tricky.)
  • Blogs have enemies with significant economic and political power. That has an opportunity cost for developers of pub/sub solutions and it removes a potential source of innovation and communication.
  • Normal humans (aka civilians) do not use dedicated feed readers. That was a bridge too far. They don't use Twitter either btw and are really struggling with email.
  • Even for geeks, standalone feed readers on the desktop were killed by Google Reader. Standalone readers do persist on intermittently disconnected devices (aka smartphones).
  • Blog comments have failed miserably. The original backlink model was killed by spam. (Bits of Google Reader Share and Buzz point the way to making this work, but Google seems unable to figure it out.)
  • The quality of what I read is, if anything, improving. I can't comment on overall volume, since I don't care about that. I have enough to read. It is true that some of my favorites go quiet for a while, but they often return.

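Here's how small the core of sub/pub/notify really is -- a minimal polling sketch, assuming Python 3 and a well-formed RSS 2.0 feed at a made-up URL (real readers also handle Atom, namespaces, conditional GET, and redirects):

    # Poll an RSS 2.0 feed and surface items we haven't seen before.
    import urllib.request
    import xml.etree.ElementTree as ET

    def new_items(feed_url, seen):
        """Return (title, link) pairs whose guid/link isn't in `seen`."""
        doc = urllib.request.urlopen(feed_url).read()
        channel = ET.fromstring(doc).find("channel")
        fresh = []
        for item in channel.iter("item"):
            guid = item.findtext("guid") or item.findtext("link")
            if guid not in seen:
                seen.add(guid)
                fresh.append((item.findtext("title"), item.findtext("link")))
        return fresh

    seen = set()
    for title, link in new_items("http://example.com/feed.xml", seen):
        print(title, "->", link)

Everything contentious -- monetization, comments, readers for civilians -- lives above this layer. The layer itself is too simple and too useful to die.
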
Short version - it's a murky mixed bag. The good news is that pub/sub/notify is not going away, and that compulsive communicators will write even if they have to pay for the privilege. The bad news is that we're probably in for some turbulent transitions towards a world where someone can monetize the infostream.

Friday, July 01, 2011

The terrible advantage of the blogosphere

Sometimes we get stuck with the most awful words. Blog, blogging, blogger, blogosphere. Hate 'em.

I hate the name, but I love the medium. Today's example comes from a Blood & Treasure link to a Granite Studio post on the 90th anniversary of the Chinese communist party as told through the Mad Men TV show. Brilliant, caustic, funny, insightful, educational and all about modern China - writing doesn't get better than this.

All free. These sites don't even have ads.

On the other hand, we have The Atlantic, a magazine fairly recently edited by another superb blogger - James Fallows. I love James's writing, and I read the blogs of many of The Atlantic's writers, but the magazine is mostly silly. This month's lead article on how we're ruining children by raising their self-esteem was so excruciatingly idiotic it single-handedly killed my next subscription renewal (still time for a turnaround, James).

How can this be? The writers for The Atlantic are pros -- even the worst of them.

It's volume. There are millions of bloggers producing thousands of posts. I've read a mere 230,000 or so, and shared perhaps 25,000. Even if only one in ten thousand is excellent, my network of readers and bloggers will find and expose it. I don't care that 99.999% are drivel -- because I don't see those. I see the one in 10,000, and of those I read fewer than 1 in 10.

No magazine can compete.

Update 7/2/11: Corrections in italics and strikeout! My apologies, James. I thought you were still editor. A reader informed me that you are a correspondent again. I plain forgot. Maybe that explains the recent 'alternative medicine' and 'spoiled child' articles. Makes it easier to donate money to my favorite blogger rather than renew.

Thursday, June 30, 2011

Stross whiffs on the Singularity

Charlie Stross has been heads down writing for a while, but he must have his books in the bag because his blog is aflame again.

Naturally, knowing we crave red raw meat, he started with an attack on geek theology. He beat up on the Singularity.

Go read the essay. Here's my quick digest of his three arguments:

  1. We won't build super-intelligent sentient computers because .... well ... we just won't ... because .... we're not that stupid and it wouldn't serve any obvious purpose.
  2. Uploading consciousnesses won't work because we didn't evolve to be uploaded and religious sorts will object.
  3. We aren't living in a Simulation because ... well, we might be ... but it's not falsifiable so ...

Charlie! What happened? This is your most muddled essay in years.

Not to worry too much though. Charlie followed up with three excellent posts. I think he was just rusty.

See also:

PS. Where am I on all things Skynet? I think we'll create artificial sentience and it will be the end of us. Unlike Charlie, I think there will be great economic advantages to push the limits of AI towards sentience, and we won't resist that. I'm very much hoping that is still 80 years away, but I'm afraid I might see it before I die. I think brain uploading is a hopeless dream. As for us living in a Simulation -- it does explain the Fermi Paradox ...

Civil War, Polar edition

The good news is that corporations may not be as powerful as I thought they were.

The bad news is that we're reenacting a Cold version of the American Civil War ...

To the Limit - Krugman - NYTimes.com

... Last December, after Mr. Obama agreed to extend the Bush tax cuts — a move that many people, myself included, viewed as in effect a concession to Republican blackmail — Marc Ambinder of The Atlantic asked why the deal hadn’t included a rise in the debt limit, so as to forestall another hostage situation (my words, not Mr. Ambinder’s).

The president’s response seemed clueless even then. He asserted that “nobody, Democrat or Republican, is willing to see the full faith and credit of the United States government collapse,” and that he was sure that John Boehner, as speaker of the House, would accept his “responsibilities to govern.”

Well, we’ve seen how that worked out...

... G.O.P. leaders don’t actually care about the level of debt. Instead, they’re using the threat of a debt crisis to impose an ideological agenda. If you had any doubt about that, last week’s tantrum should have convinced you. Democrats engaged in debt negotiations argued that since we’re supposedly in dire fiscal straits, we should talk about limiting tax breaks for corporate jets and hedge-fund managers as well as slashing aid to the poor and unlucky. And Republicans, in response, walked out of the talks.

So what’s really going on is extortion pure and simple. As Mike Konczal of the Roosevelt Institute puts it, the G.O.P. has, in effect, come around with baseball bats and declared, “Nice economy you have here. A real shame if something happened to it.”

And the reason Republicans are doing this is because they must believe that it will work: Mr. Obama caved in over tax cuts, and they expect him to cave again. They believe that they have the upper hand, because the public will blame the president for the economic crisis they’re threatening to create. In fact, it’s hard to avoid the suspicion that G.O.P. leaders actually want the economy to perform badly.

Republicans believe, in short, that they’ve got Mr. Obama’s number, that he may still live in the White House but that for practical purposes his presidency is already over. It’s time — indeed, long past time — for him to prove them wrong.

The GOP has the support of about half the US. The Dems have the other half; at the moment, the smarter half.

Obama offered the GOP what, a few months ago, they said they wanted. Now they reject it. They will destroy America to save it.

The last president from Illinois compromised and compromised -- until he couldn't.

This time I'm hoping for a Cold version of a national conflict.

I need to find out how to short US Treasuries.

See also

Denialists in favor of a warmer earth

Talking Points attended a denialist gathering and brought back this poster photo ....

[poster photo]

What caught my attention is that the flat earthers no longer deny that the earth is warming. Their new member-mandatory beliefs are:

  • Global warming isn't due to human activity
  • A warmer earth is a good thing

This is a slightly more interesting flavor of nonsense. Even though these beliefs spring from tribal identity rather than science, they can be evaluated in a scientific framework.

The first thesis is the weakest. Solar output, if anything, may be transiently declining; the sun certainly does not appear to be increasing its output. Natural variation may actually be mitigating human warming.

The second is more interesting. A slow warming of the earth would shift ecosystems. Tropical animals, like low-population pre-industrial humans, would probably benefit from a mild, slowish warming. Rapid warming in a world of 8+ billion post-industrial humans is another matter entirely. I can't see that going well, though some countries (Oh, Canada) may do better than others (China, coal is not your friend).

We may yet envy the polar bears.

See also:

I think I have a Google Reader problem ...

From Google Reader trends: "Since October 7, 2005 you have read a total of 232,935 items."

RIM and rebooting a failed software company

The iPhone was introduced on January 9th, 2007. I was shocked.

It went to consumers in mid-2007. Naturally the share price of RIM (BlackBerry) responded ....

[chart: RIMM share price]

Responded, that is, by going up. Way up.

A year later the App Store was online and, with iOS 2, the iPhone was truly usable. Most importantly, Apple licensed Microsoft's ActiveSync. The iPhone could now get corporate calendars and email.

That's when the market finally saw the light and RIMM slid -- but then, astoundingly, it mostly recovered!

No wonder RIM's CEOs could delude themselves into thinking they still had a business. The market was as delusional as they were. It wasn't just the market and a couple of overpriced executives; pundits continued to talk about RIM as though they were a serious contender against Android and iOS.

I thought RIM was finished. It didn't matter how much money they were making. They had a first-class horse-drawn carriage, but Apple was selling a BMW sedan for a trifle more. It was insane to imagine that RIM, owners of a 1980s-vintage, PalmOS-style platform, could possibly compete. Now it's obviously over, and the big guys will fight for RIM's patents.

It's an interesting story -- because the market lagged reality by so much. Why was it so irrational? I'm sure we can make up post-hoc explanations. Maybe the market assumed Microsoft would buy them, and only gave up when Ballmer "acquired" Nokia (hope that goes better than Skype will).

Myself, I suspect the market is frequently irrational. Every exec had a BB, and because of the way RIM licensed server access it was a real power symbol. Plebes carried flip phones; executives carried BBs. When plebes started carrying iPhones, execs couldn't wrap their heads around the power inversion. To sell RIM was to admit their personal power token had gone the way of the typewriter.

RIM has one more lesson to teach us. Today a remarkably hopeful but anonymous RIM employee published a roadmap for RIM's recovery. It will be familiar reading to anyone who's ever been part of a dying software enterprise. It came with 8 recommendations, but four of them caught my attention.

  1. Focus on the users, not the buyers.
    RIM sold their phones to carriers. Sprint, AT&T, Verizon and the like. We know them well -- we hate them, they hate us. Selling to the buyer is standard business practice, but it's also a trap. RIM fell into that trap; Apple, astoundingly, did not.
  2. Have senior executives that live and breathe the making of software.
    Most IT businesses are run by MBAs and generic executives. People who know business, but don't know the product. RIM went that way.
  3. Focus.
    Cut everything but the core. Hang the cost of cutting. Do it.
  4. Focus on the ecosystem.
    In this case, developers. Sometimes this is a consultant network.

RIM's marketing department responded to the letter. They are so, so dead.