
Friday, June 28, 2013

Blogger.com and Wordpress.com traffic post the end of Google Reader

Inspired by this Rumproarious post on the effects of the Google Reader shutdown announcement of mid-March 2013 (data excludes custom domains hosted by Blogger or WordPress).

[Alexa traffic graph: WordPress.com]

[Alexa traffic graph: Blogger.com]

That's ... impressive. I'm still puzzled that WordPress didn't have a stronger response to the Google Reader shutdown. My best guess is that they'd already decided to abandon the WordPress.com business.

It will be interesting to return to this topic in six months. I'm amazed how many good alternatives have already emerged to Google Reader. I didn't think we'd have so many choices.


Thursday, April 18, 2013

The Net is a forest. It has fires.

Whatever RSS (Atom, etc) was intended to be, it became the standard plumbing for subscription and notification. When In Our Time has a new podcast available, Google Reader's use of RSS tells me to get it. When Emily adds a new event to her calendar, RSS lets me know about it.

These are useful tools, but most of all RSS is the plumbing that enables Google Reader to track the hundreds of publishing sources I follow. Some of them publish dozens of stories a day, some publish 2-5 times a day, and some publish every few weeks. RSS and Google Reader mean I can follow them all. Without it the NYT would still be interesting -- I'd just visit it less often. I would give up on those infrequent publishers though, even the ones I love.
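As an aside for fellow geeks, the plumbing itself is not complicated. Here's a minimal Python sketch (the feed URL is hypothetical) of the core of what any reader does: fetch a feed, parse it, and list the item titles. Everything Google Reader adds -- polling schedules, read/unread state, sharing -- sits on top of something like this.

    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://example.com/feed.xml"  # hypothetical feed address

    def item_titles(url):
        # Fetch and parse the feed XML.
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        # RSS 2.0 keeps items under channel/item; Atom uses namespaced <entry>.
        items = root.findall("./channel/item")
        if not items:
            items = root.findall("a:entry", {"a": "http://www.w3.org/2005/Atom"})
        for item in items:
            yield (item.findtext("title")
                   or item.findtext("{http://www.w3.org/2005/Atom}title")
                   or "(untitled)")

    for title in item_titles(FEED_URL):
        print(title)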

Many of those infrequent publishers are "amateur" writers who use blogs. RSS is the democratizing force that put them and the New York Times on an equal footing -- much to the NYT's chagrin. RSS is one of the things that makes blogs work -- esp. the blogs I love.

Since RSS has been pretty important to blogs, and since Google Reader has been the dominant RSS client for years, it's worth seeing what the major blog platforms are saying about the end of Reader.

We'll start with Blogger. That's a huge platform, they must have had a lot to say ...

<crickets>

Ok. That's weird. Let's take a look at another biggie - Tumblr, home of 100 million blogs.

<single cricket>

Wow. Spooky. Ok, let's go to the real core. The home of WordPress, the world's dominant professional blogging platform...

<intergalactic space>

As Goldfinger says, "Once is happenstance. Twice is coincidence. The third time it's enemy action."

Something is happening. It feels like a fire is coming to the Net. Again.

The first fire I remember was the end of Usenet. Yeah, I know it's technically still running, but it's a faint shadow of the days when I posted about Mosaic for Windows in WinOS2. The Usenet archive nearly vanished when DejaNews failed, but Google rescued it. That was a different Google than the one we know now.

The next fire took out GeoCities. GeoCities was once the third most valuable property on the Net; thirty-eight million web pages died when Yahoo closed it. (Did you know Lycos.com is still around and that it still hosts Tripod? I was shocked.)

Yes, maybe 90% of those pages were junk, but that leaves about 4 million pages of people writing about things they were passionate about. Apple's termination of MobileMe .mac web sharing destroyed a much smaller amount of content, but even now I come across references to great .Mac content that's gone. Not just moved somewhere else, gone.

The end of GeoCities and .Mac was matched by the end of applications like FrontPage and iWeb. Those apps let geeky amateurs publish to their (web) "hosting" services. Most of that content is lost now -- millions of pages.

No wonder it's hard to find things I read on the net in the 90s. The fires took it all.

Today it feels like the fire is coming again, and once again amateur content will be purged.

I wonder if it will return again.

Saturday, March 16, 2013

Project Ducky - why I've stopped using new Cloud services.

[Dilbert, 04/14/1994]

Yeah, Google Reader is on many geek minds today, but it's not the only cloud death to disrupt my routine. Today, working in Aperture, I tried searching for an AppleScript workaround for Aperture's single-window mangled Project problems. I found a couple of good references -- to Apple .Mac pages that died with MobileMe. The page owners never recreated their lost resources.

Later I wanted to upload some photos from our special hockey team. I remembered then that Google discontinued Mac/Picasa integration and the iPhoto Plug-in.

Within a year I expect Google is going to discontinue Blogger, which currently hosts this blog.

Enough.

I'm on strike. You want my business? Give me standards. Give me products I pay for that have low exit costs and that have competitors. 

Oh, yeah, Google - go away.


Wednesday, July 04, 2012

Computing 2012: The End of all Empires

I grew up in a bipolar world.

Yes, the USSR vs. USA, but also the bipolar world of Microsoft and Apple. One was ruthless and ruled by corporate power, the other was a stylish tyranny.

Times changed. The USSR fell apart leaving a Russian mafia state ruled by a mobster, and the USA fell into a spiral of fear, wealth concentration, political corruption, and institutional failure. China grew wealthy, but turned into a fascist state run by oligarchs and mobsters. The EU has Greece and Italy and the second Great Depression. India, Brazil, everyone has problems, nobody is a secure Power. Now we live in a multipolar world.

Weirdly, the same thing has happened to the world of computing (now including phones). Microsoft's slow collapse is this week's Vanity Fair special. Google joined the Sith and all it got was dorkware, a human-free social network, and a profit-free phone. Post-IPO Facebook is rich and frail looking. Dell, HP, Motorola, RIM and Nokia are history.

Ahh, but what about Apple? Isn't Apple going from power to power -- even in the old Mac/Windows wars?

That's how it looks - to the press. Today. But I'm just coming off an epic 1 week fiasco involving OS X Lion and iCloud. It ended with me deciding to keep my primary machine on Snow Leopard and reverting my iPhone to iTunes sync after years of MobileMe sync. I'll try again when Mountain Lion is out.

Yes, few people will run into the problems I have had (arising at least in part from an obscure geeky bug with OS X/Unix vs Windows "line termination"). Many people, however, will run into some problems. My experience shows that many months after Apple's grandiose iCloud launch and insane MobileMe/iCloud migration, they still don't have troubleshooting tools and procedures or, amazingly, any way to delete your iCloud.com data. It's as though they thought they'd get everything right the first time -- perhaps because everyone associated with MobileMe was purged.

That's a hell of a miss for a corporation with billions in the bank and a fifteen year history of bungling online services.

Then there's the Apple ID/FairPlay/iCloud problems. My friends are struggling with these. Other friends can't figure out how to manage Ringtones on iTunes.

Perhaps most worrisome of all, Apple is providing mega-compensation packages to its corporate executives because, apparently, they must be retained. An unavoidable step with inevitable consequences. Bad consequences.

Apple doesn't look strong to me. It looks vulnerable.

Google, Apple, Microsoft, Facebook. None of them are serving me well. None of them are looking all that strong.

All the Empires are falling. My personal balancing act is becoming more complex all the time.

Tuesday, May 29, 2012

Apple's quality problem is a complexity problem too

Marco Arment, builder of a respected OS X app, writes
Three Things That Should Trouble Apple – Marco.org
... Apple’s software quality is declining.
I’m not just talking about the most recent releases of everything, or the last couple of months — I’ve noticed this trend for about 2–3 years. As Apple’s software has grown to address larger feature sets, hard-to-solve problems such as sync and online services, shorter release cycles, increasingly strong competition, and Apple’s own immense scale, quality has slipped...
I've noticed it for at least that long, but OS X Lion is a particular disappointment.

Siri is a recent example. After a promising start Siri died. I'll pin this one on Cook. Jobs had his flops, but he usually turned off the marketing until the bodies were buried or the problems were fixed.

The bigger quality problem though - that one came from Jobs. Some of it had to be the talent distraction of iOS development, but Lion is full of bad choices. Whoever decided to change how files were saved instead of focusing on quality and reliability should find another job.

Then there's the MobileMe to iCloud migration. Was there really no way to incrementally fix and extend MobileMe?

It is true that the endless malware race forces developers into disruptive and often imperfectly tested software updates. Microsoft faces this problem too, however, and I think they're doing better with it. Apple chose to inflict on its customers much of the combinatorial complexity of interactions between iOS and OS X devices, and of synchronization across disparate data models -- not to mention the crazed, DRM-driven, metastatic Apple ID Hell of family/payer/owner identity and authentication. Apple chose to focus on marketing rather than customer value when it created the Aperture/iPhoto mess - and Apple continues to market Aperture as a smooth upgrade for iPhoto despite that mess. Rather like Siri, come to think of it.

Apple's quality problems are far from under control, and I don't think Cook's executive compensation plans are helping.

Thursday, February 23, 2012

Interesting problems: MobileMe, iCloud, Lions and Google

These are interesting times for our home IT strategy. Recent Apple changes are accelerating the demise of the general purpose computer in ways I'm only beginning to understand.

At our home, for example, we have been using MobileMe to manage Contact synchronization between iOS devices (iPhones w and w/o SIMs) and our four Macs. Each of the Macs has multiple user accounts to serve some subset of our family of five. (I'm omitting the Google Apps aspects to simplify - things are complex enough).

This has to change; MobileMe is going away June 30, 2012. It is being replaced by iCloud. iCloud works with most of our iOS devices, but it requires Lion (Mountain Lion soon) on the Macs.

Of our four Macs, only two will definitely run Mountain Lion [1].

Meanwhile, Google's Anakin Skywalker emulation means we want to back away from our dependency on them, which leaves, unfortunately, even more dependency on the Other Devil.

I think, going forward, the transition over the next year is to:

  • A server (iMac) and 1-2 laptops
  • iPads - eventually one for each child ($$)
  • iCloud (yech)

I'm glum, but I don't see how I fight this. The old G5 iMac with the decaying (delaminating?) display may go to anyone interested in a machine that can run MacOS Classic. The Dual USB laptop will stay Snow Leopard and be a kid's homework machine.

[1] Lion is such a mess I've been restricting it to a single device that came with it. It's unlikely that Mountain Lion will work on our Core 2 Duo plastic body MacBook with integrated Intel graphics, though there are rumors that Apple may add some coverage. It appears even Apple is embarrassed by Lion, and may feel the need to bury it quickly.

Update 2/25/12: ironically, one reaction to Apple's Snow Leopard/MobileMe/MacBook triple termination is to turn to the other agent of the DarkSeid - Google. I used Spanning Sync ($25/year) with some success for several years to sync Address Book to Google. We already use Google Apps Calendar for our family including our iPhone calendars. So one solution is to go Google across all our Macs, even on Mountain Lion.

Sunday, February 12, 2012

My iPhoto miscalculation - whitewater world

iPhoto is dying.

That much is clear. iPhoto 11's launch bug problems followed the pattern of the past decade. Unlike past releases though, iPhoto 11 lost important capabilities -- just like iMovie and Final Cut Pro X regressed from prior versions.

That's bad, but what's worse is that, after seven years of sort-of trying, Aperture is still not an adequate iPhoto replacement.

Bad news on bad news, but the real sign of a dying platform is the echoing silence. When users stop complaining, a software platform is dead.

Fortunately I had planned for this from the very beginning. I knew, nine years ago, I was taking a big risk putting my photos and data into an Apple product. Even then Apple had a reputation -- it didn't worry much about customer data. I figured Apple might abandon iPhoto, but I also figured the Mac community would come up with a migration solution.

I was wrong. There's no migration to Lightroom, there is no exit from iPhoto that preserves data.

Where did I go wrong? I missed this ...

Of funerals, digital photos and impermanence — Tech News and Analysis

... Apps like Instagram and Path, both of which I love, actually make this problem worse instead of better in some ways. They are great for sharing quick snapshots of a place you are visiting or someone you are with or what you are eating — and you can share those easily to Flickr and Facebook and Tumblr and lots of other platforms (more than 26 photos are uploaded to Instagram every second). But do you want to save all of these for a lifetime, along with the ones you took of your new baby or your sister’s wedding? Probably not. So again, there is a filtering problem....

I didn't imagine that geeks would basically give up, overwhelmed by rapidly changing technologies. I didn't anticipate that the 'prosumer' computer platforms would die instead of reforming. I didn't imagine that OS X would go into maintenance mode. I didn't imagine a technology regression of this magnitude.

I expected rough waters, I didn't expect whitewater.


Saturday, January 07, 2012

Divorcing Google - and hoping for GoogleMinus

Google hired a hit man, and was shocked to find bodies. Now the remnants of Google 1.0 are punishing Google 2.0, though Matt Cutts hasn't said anything. He didn't use to be so quiet.

Google 1.0's glorious Data Liberation Front is pretty quiet too. Their Twitter feed went silent on 9/15/2011.

It's sad. I loved Google from my first searches in 1997 until the Google-Apple war began in 2009. Even then I wanted to believe in their original mission to free the world's knowledge. I didn't really lose faith until Google deleted my Reader Share memory - with 1 week's notice.

Yeah, I was denying simple arithmetic. Google is an algorithmic corporation that iterates on its goals, and its goal is to deliver value to its paying customers. Advertisers. Yes, Google, like health insurance corporations, has an Agency problem.

Notice how much junk there is in Google searches these days? How many "plus.google.com" pages? How the ads are growing across Google's search pages? This isn't going to stop -- not unless Google divides into two companies.

Now I'm moving off the Google platform. It's a slow and painful process. Just as painful as moving from DOS to Mac Classic, from Mac Classic to Windows, Windows to OS/2, OS/2 to Windows 2K, Windows to OS X, Palm to iOS…

Especially like moving from Palm to iOS. That's because there was a wasteland between the end of PalmOS and the rise of iOS. It was a kind of technological winter; a gap between one life and the other. I nursed my aging Palm devices along because the alternative was to resurrect my Franklin Planner.

That's what life is like post-Google - because there isn't a good alternative to Google. iCloud? Please. Microsoft? I wish. Yahoo!? Ok, you get the point. I won't go on.

So it's a tough divorce. It would be even harder if I were an Android customer. I wonder if Android users understand how tightly they're tied to Google -- and why.

How will this winter end? Amazon? Apple? Microsoft?!?

Or … perhaps …. GoogleMinus.

GoogleMinus, because there are businesses in Google that make money selling value to users. Google Docs, for example, sells ad-free solutions to schools and corporations. These aren't huge businesses, but they might be profitable if they could lease infrastructure from GooglePlus.

GoogleMinus, because there are still, I think, some idealists left at Google -- and they might prefer to work for GoogleMinus rather than GooglePlus.

It could happen. The EU might require a breakup. Civil strife within Google might make a breakup internally acceptable.

Imagine a future when GoogleMinus packages GooglePlus search. For an annual fee of $100 I get the ability to block plus.google.com searches, and a control that lets me filter out web sites that run ads. Wouldn't that be interesting?

Update 1/9/12: This Android retrospective is relevant. I'd forgotten the day Google made its deal with Verizon; a deal signed in blood at midnight.

Update 1/10/12: Today Google dedicated their search function to promoting Google+ properties.

Saturday, November 26, 2011

Mass disability goes mainstream: disequilibria and RCIIT

I've been chattering for a few years about the rise of mass disability and the role of RCIIT (India, China, computers, networks) in the Lesser Depression. This has taken me a bit out of the Krugman camp, which means I'm probably wrong.

Yes, I accept Krugman's thesis that the proximal cause of depression is a collapse in demand combined with the zero-bound problem. Hard to argue with arithmetic.

I think there's more going on though. There are secular trends that will be with us even if we followed Krugman's wise advice. In fact, under the surface, I suspect Krugman and DeLong believe this as well. I've read Krugman for years, and he was once more worried about the impact of globalization and IT than he's now willing to admit. Sometimes he has to simplify.

For example, fraud has always been with us -- but something happened to make traditional fraud far more effective over the past thirteen years. I think that "something" was the rise of information technology and associated complexity; a technology that allowed financiers to appear to be contributing value even though their primary role was parasitic.

Similarly, the rise of China and India is, in the long run, good for the entire world. In the near future, however, it's very hard for world economies to adjust. Income shifts to a tiny fraction of Americans, many jobs are disrupted, people have to move, to change careers, etc. It takes time for new tax structures to be accepted, for new work to emerge. IT has the same disruptive effect. AI and communication networks will further limit the jobs we can take where our economic returns are equal or greater than the minimum wage.

I think these ideas are starting to get traction. Today Herbert Gans is writing in the NYT about the age of the superfluous worker. A few days ago The Economist reviewed a book about disequilibria and IT ...

Economics Focus: Marathon machine | The Economist

... Erik Brynjolfsson, an economist, and Andrew McAfee, a technology expert, argue in their new e-book, “Race Against the Machine”, that too much innovation is the bane of struggling workers. Progress in information and communication technology (ICT) may be occurring too fast for labour markets to keep up. Such a revolution ought to be obvious enough to dissuade others from writing about stagnation. But Messrs Brynjolfsson and McAfee argue that because the growth is exponential, it is deceptive in its pace...

... Progress in many areas of ICT follows Moore’s law, they write, which suggests that circuit performance should double every 1-2 years. In the early years of the ICT revolution, during the flat part of the exponential curve, progress seemed interesting but limited in its applications. As doublings accumulate, however, and technology moves into the steep part of the exponential curve, great leaps become possible. Technological feats such as self-driving cars and voice-recognition and translation programmes, not long ago a distant hope, are now realities. Further progress may generate profound economic change, they say. ICT is a “general purpose technology”, like steam-power or electrification, able to affect businesses in all industries...

... There will also be growing pains. Technology allows firms to offshore back-office tasks, for instance, or replace cashiers with automated kiosks. Powerful new systems may threaten the jobs of those who felt safe from technology. Pattern-recognition software is used to do work previously accomplished by teams of lawyers. Programmes can do a passable job writing up baseball games, and may soon fill parts of newspaper sections (those not sunk by free online competition). Workers are displaced, but businesses are proving slow to find new uses for the labour made available. Those left unemployed or underemployed are struggling to retrain and catch up with the new economy’s needs.

As a result, the labour force is polarising. Many of those once employed as semi-skilled workers are now fighting for low-wage jobs. Change has been good for those at the very top. Whereas real wages have been falling or flat for most workers, they have increased for those who have advanced degrees. Owners of capital have also benefited. They have enjoyed big gains from the increased returns on investments in equipment. Technology is allowing the best performers in many fields, such as superstar entertainers, to dominate global markets, crowding out those even slightly less skilled. And technology has yet to cut costs for health care, or education. Much of the rich world’s workforce has been squeezed on two sides, by stagnant wages and rising costs.
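A back-of-the-envelope sketch of the book's exponential point, with an assumed 18-month doubling period (the review only says "every 1-2 years"): the early doublings look flat, the later ones look like a wall.

    DOUBLING_PERIOD_YEARS = 1.5  # assumption; the review says every 1-2 years

    for years in (3, 6, 12, 24):
        factor = 2 ** (years / DOUBLING_PERIOD_YEARS)
        print(f"after {years:2d} years: ~{factor:,.0f}x the starting capability")

    # after 3 years: ~4x ... after 24 years: ~65,536x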

In time the economy will adjust -- unless exponential IT transformation actually continues [1]. Alas, the AI revolution is well underway and technology cycles are still brutally short. I don't see adjustment happening within the next six years. The whitewater isn't calming.

[1] That is, of course, the Singularity premise, as previously reviewed in The Economist.

Update 12/3/2011: And how does the great stagnation play into this - Gordon's Notes: Ants, corporations and the great stagnation?

Friday, November 18, 2011

Social media is so 2000

GigaOm has a longish cloud computing post around a Peachtree Capital Advisors investor survey (full report is by request only).

I usually don't pay much attention to consulting group reports like this, but there were a few comments that struck me as interesting....

VCs: Don’t mistake cloud computing for cloud opportunity — Cloud Computing News

... tech investors are underwhelmed by social computing: A whopping 88 percent characterized the social media segment (including collaboration) as overvalued....

... The whole big data explosion that most businesses are trying to capture depends on the wide availability of diverse data from many sources, including the so-called Bermuda Triangle of Facebook, Twitter and Google...

... 35 percent of those surveyed said they think enterprise software as a category is undervalued...

By enterprise software they presumably mean Microsoft, Oracle, SAP, etc.

I was struck by the declining interest in social media. That may be because investors figure it's a mature segment (!) and Facebook owns it. Or that consumers are (re)turning to Cable TV.

I think both may be true. When a sclerotic company like Google 2.0 jumps into a domain, you can be pretty sure it's yesterday's news. Consumer tech cycles are viciously short now; fashion designers understand this all too well.

On the other hand, I'm also impressed by how quiet Facebook, G+ and the rest feel now, and "how happy this man looks" (SplatF). By my estimate we're in year 12 of the long depression, and we have years to go. Cable TV has not been displaced, and if consumers have limited time and attention ...

Saturday, November 12, 2011

A good time to invest in old bicycles?

As all my friends know, I'm one of those annoyingly cheerful Pollyannas, nothing like that Kassandra fellow we all ignore [1].

So I liked Jay Goltz's NYT blog post on the case for optimism ...

...  things have slowly been getting better. In 2011, I hired about five additional people. And I really hired them. No 1099 contract workers, no temporary workers, no part time...

... seeing an increase in the amount of furniture people are buying, partially because houses have been selling again and people are moving again. Large real estate projects in the corporate world that have been on hold are being completed, and art is being bought for the walls. And my picture-framing business has started to see customers who come in with art they say they have been meaning to frame...

... With the exception of things like restaurant meals and car washes, many purchases can be put off only so long. Eventually, they have to happen. Roofs, air conditioning units, clothes, cars and even dental care will be bought. In my business, I have been buying new equipment –  trucks, computers — and taking care of maintenance that had been avoided the previous couple of years. I have talked to four car dealers who say they are very busy, as well many other business owners from roofing contractors to a large carpeting business. Almost all say things are better and that they believe pent-up demand is one reason...

This is how balance sheet or even liquidity trap recessions are supposed to end. It may take a very long time, but eventually people spend. Or wars happen and governments spend (oops, that wasn't so optimistic).

There are some countervailing sentiments however. Europe is doing a slow motion version of the Crash of '09. Maybe we'll get to see how it plays out without massive governmental intervention [2].

Meanwhile, perhaps related to the slow motion train wreck of European finance, Google is cutting back on its projects. Adobe just shut down its decade-long investments in Flash, Flex, and Air. Olympus is collapsing because it can no longer conceal losses from 17 years ago -- and nobody believes Olympus is the only Japanese, or US, company with falsified accounts. AT&T is squeezing customers hard. Apple's quality problems continue.

In a development that goes largely unnoticed, corporations are taking a "destroy the village to save it" approach to information security. The diversion of corporate wealth to elite compensation continues, with effects that are poorly understood.

Lastly, our whitewater world is no less frothy, complexity attacks are still ubiquitous and virtually unnoticed - and the AIs are getting smarter [3]. If you're a 'structuralist', you'd say that the Great Disruptors are still working on the world order.

And there's the "China bubble" (334,000 Google hits today).

So is this a good time to invest in proven bicycles, long lasting antimicrobials, and garden tools?

Well, bicycles are always a good idea, but I suspect what lies ahead is, as usual, a lot like what lies behind.

Some things are improving. Other things are worsening. So the US will see some trendline improvement with periodic disruptions -- and we'll be lucky to do that well.

[1] We all know, of course, that the curse of Cassandra was that she would be always right and always ignored.
[2] It is comically ironic that the "marketarian" leaning US government should be able to intervene and the European Government cannot. Oh, wait, that's right. Europe doesn't have a government .... 
[3] Meanwhile quantum computing is looking more real every day. Not that that will be disruptive.

Friday, October 21, 2011

Google Reader: This is going to hurt

Let's get the good news out of the way.

Google has stepped back from their Buzz/G+ nymwar policy. Google will support pseudonyms. So they listened after all.

More good news. As promised, Google Reader lives.

That's enough of the good news. Don't want to overdose.

The bad news is that Google will be ripping out a lot of GR in favor of G+, even though G+ lacks a mechanism for subscribing to aspects of a person's stream. Much of the functionality I love, such as the feed for GR shares, the web page created from GR shares and notes, the ability to follow my trusted curators' shares -- it's all at risk. In my case, tens of thousands of annotations, a vast amount of Cloud data, is at risk. Reeder.app, by far my most heavily used iPhone app, is at risk.

The worst news is that Google is giving us 1 week's warning. It's almost as though they want to get this over with before they get a nymwar level of feedback.

Happily, we bereaved GR users are not alone. There are 357 comments on Alan Green's G+ announcement, and the last few hundred are a tad ... unhappy. Please feel free to add your comments one way or another.

My primary comment is that Google needs to stop and think - carefully. Sure, there aren't many GR power users. What we lack in numbers, however, we more than make up in geekery. We are uber-geeks and/or journalists, and we have a long memory. Apple can blow away data, but we don't mind. We never trusted them with our data. Google though, Google's not Apple. We expect different failures from Google.

There's a tsunami of hurt building in the obscure little GR community. We may be small Google, but we're rabid little buggers. E.D. Kain, Sarah Perez, Skeptic Geek, Jesse Stay, Incidental Economist, Martin Steiger, Brett Keller, me... We're coming out of the woodwork.

There is a right way to fix GR. That would be to clean up and fill out the current feature set, and replace Reader's dead Buzz functionality with similar G+ functionality. Offer us the option to share via G+ in addition to GR -- assuming G+ gets its interest streams working.

Google's making the same kind of mistake they made with the nymwars. That one they're fixing. Maybe they'll fix this one too.

So we're gonna yell. One week isn't much, but it may be enough time to get Alan Green and his colleagues to put the brakes on. Stop, then think.

Update: My companion G+ stream post (restricted).

Update 10/21/11: There's a petition expressing user concerns about Google's plan.

Sunday, September 04, 2011

Google Quick, Sick and Dead - 6th edition.

It's been only four months since the 5th edition of Google Quick, Sick and Dead. It's been a busy time though, with the launch of G+ and Google recently announcing another set of official closures. The terminations were of products I thought had already been discontinued, so I don't have them listed below.

As with prior editions this is a review of the Google Services I use personally - so Android is not on the list. Items that have moved up or are new are marked green, items that have moved down or are officially discontinued are marked red; the prior state is shown in parentheses.

For me personally the news is not good -- both Google Reader and Google Custom Search are now on the Dead list (though Google has finally fixed the broken icon that was displaying with custom search). These are two of my favorite Google services, but neither of them delivers significant ad revenue to Google. That, in a nutshell, is the problem with relying on Google's cloud. G+ is mildly interesting, but so far it's not doing anything useful for me.

The Quick (Q)
  • Google Scholar (Q)
  • Gmail (Q)
  • Chrome browser (Q)
  • Picasa Web Albums (Q)
  • Calendar (Q)
  • Maps and Earth (Q)
  • News (Q)
  • Google Docs (Q)
  • Google Voice (Q)
  • Google Search (Q)
  • Google (Gmail) Tasks (Q)
  • YouTube (Q)
  • Google Apps (Q)
  • Google Profile (Q)
  • Google Contacts (Q)
  • GooglePlus - G+ (new)
  • Blogger (S)
The Sick (S)
  • Google’s Data Liberation Front (S)
  • Google Translate (S)
  • Books (S)
  • Google Mobile Sync (S)
  • Google Checkout (S)
  • iGoogle (S)
The Walking Dead (D)
  • Buzz (D)
  • Google Groups (D)
  • Google Sites (D)
  • Knol (D)
  • Firefox/IE toolbars (D)
  • Google Talk (D)
  • Google Parental Controls (D)
  • Google Reader (S)
  • Orkut (S)
  • Custom search engines (S)
  • Google Video Chat (S) - replaced by G+ Hangout

Saturday, July 02, 2011

The state of blogging - dead or alive?

Today one of the quality bloggers I read declared blogging is dying. Two weeks ago, Brent Simmons, an early sub/pub (RSS, Atom) adopter, tackled the "RSS is dead" meme. Today I discovered Google Plus Circles don't have readable feeds.

Perhaps worst of all, Google Reader, one of Google's best apps, is getting no Plus love at all -- and nobody seems upset. The only reference I could find is in an Anil Dash post...

The Sparks feature, like a topic-based feed reader for keyword search results, is the least developed part of the site so far. Google Reader is so good, this can't possibly stay so bad for too long ...

That's a lot of crepe. It's not new however. I've been reading about the death of blogging for at least five years.

Against that, I was so impressed with a recent blog post that yesterday I raved about the terrific quality of the blogs I read.

So what's going on? I think Brent Simmons has the best state-of-the-art review. I say that because, of course, he lines up pretty well with my own opinions. (Brent has a bit more credibility I admit).

This is what I think is happening ...

  • We all hate the word Blog. Geeks should not name things.
  • The people I read are compulsive communicators. Brad, Charlie, Felix, Paul and many less famous names. They can't stop. Krugman is the most influential columnist in the US, but he's not paid for his non-stop NYT blog. Even when he declares he'll be absolutely offline he still posts.
  • Subscription and notification is absolutely not going away. Whether it's "RSS" (which is now a label for a variety of subscription technology standards) or Facebook's internal proprietary system, there will be some form of sub/pub/notify (a minimal sketch follows this list). There are lots of interesting sub/notification projects starting up.
  • Nobody has been able to monetize the RSS/Atom/Feed infrastructure. Partial posts that redirect to ad-laden sites rarely work. (A few have figured out how to do this, but it's tricky.)
  • Blogs have enemies with significant economic and political power. That has an opportunity cost for developers of pub/sub solutions and it removes a potential source of innovation and communication.
  • Normal humans (aka civilians) do not use dedicated feed readers. That was a bridge too far. They don't use Twitter either btw and are really struggling with email.
  • Even for geeks, standalone feed readers on the desktop were killed by Google Reader. Standalone readers do persist on intermittently disconnected devices (aka smartphones).
  • Blog comments have failed miserably. The original backlink model was killed by spam. (Bits of Google Reader Share and Buzz point the way to making this work, but Google seems to be unable to figure this out.)
  • The quality of what I read is, if anything, improving. I can't comment on overall volume, since I don't care about that. I have enough to read. It is true that some of my favorites go quiet for a while, but they often return.
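Here's the sub/pub/notify sketch promised above -- just the pattern in Python, stripped of any particular standard or transport. The topic name is made up; real systems add feeds, polling or push, and persistence, but the shape is the same.

    from collections import defaultdict

    class Hub:
        """Minimal publish/subscribe: topics map to subscriber callbacks."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, item):
            for notify in self._subscribers[topic]:
                notify(item)

    hub = Hub()
    hub.subscribe("gordons-notes", lambda item: print("new post:", item))
    hub.publish("gordons-notes", "The state of blogging - dead or alive?")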

Short version - it's a murky mixed bag. The good news is that pub/sub/notify is not going away, and that compulsive communicators will write even if they have to pay for the privilege. The bad news is that we're probably in for some turbulent transitions towards a world where someone can monetize the infostream.

Friday, January 28, 2011

Turbulent world, rigid software

One of the fringe benefits of playing professor is that I can insert my idiosyncratic observations into relatively innocent minds.

Last night, during a health informatics lecture, I described the remarkable rigidities in an intersecting set of vertical software systems I know well. Some of the applications are older than the younger students, others are just maturing. They're all strung together by a rickety set of interfaces and interdependencies; even routine data configuration is problematic. It's an interlocking and rigid system of brittleware. When business conditions change, brittleware breaks.

That's not unusual. We see it even on solitary desktop applications. PowerPoint 2007 is clearly senile; it needs a long cruise on a railing-free ship. Brittleware is everywhere.

Problem is, the world changes. Of course that's not new; the 20th century was packed with change. For most of that time, however, we used people and paper. People and paper are relatively easy to change. Even hardware is easy to change. Software though, software is hard.

So what impact does rigid software have on the ability of businesses to adapt to changing conditions? Does it become a true impediment to adaptation?

Sunday, January 02, 2011

Resolution 2011: Managing complexity

I'm good with resolutions. Mostly because I know how to pick 'em. I make 'em doable.

Consider sleep. I like to exercise, but in my life sleep is more important. So I've resolved to sleep at least 52 hours a week [1]. I think I can do that if I track the numbers and identify where I fall short.

That's one for 2011. The other resolution is about managing technological complexity.

I've been on a complexity reduction kick for a few years, but this year my focus is on technological complexity. I'm starting with the plausible assumption that we all have a personal "complexity budget". Some of us can manage more complexity, others less, but we all have memory and processor limits -- even the AIs among us.

We can spend our complexity capacity figuring out how to adjust product development to available capacity, or we can spend it figuring out what parts of SharePoint are worth using [2]. Both tasks will be equally draining.

At some points in my life I had complexity capacity to spare - perhaps because I wasn't using it wisely. That's not true now.  Gains from improved productivity techniques [3] and growth of mind [4] are offset by entropic neurons. Most of all though, my life overflows. I'm not complaining -- it's an overflow of good stuff. It means though, that I need to use my complexity capacity wisely.  I can't be spending limited firepower figuring out which of my 15 Google identities is running a feedburner bot linked to a pseudonymous twitter account.

It's not easy to reduce technological complexity. It often means making do with less; giving up tools and solutions I really like. Often it means declining new incrementally better improvements -- because a 10% gain isn't worth the learning curve and failure risk. Sometimes it means giving up on old tools that still work but are increasingly unsupported. Yes, it's a lot like software portfolio management.

Looking at how technological complexity grows in my life I can identify four broad causes....

  1. Taking on too many simultaneous tools and solution sets.
  2. Failure to clean up. Ex: Abandoned user identities, google accounts, etc. Creates noise and clutter.
  3. Premature adoption of technologies and solutions. Ex: any new OS X release; any new Apple hardware; trying to get Contact synchronization to work across Google Contacts, OS X Address Book, and the iPhone; OS X Spaces. Above all - Google Wave.
  4. Prolonged use of increasingly unsupported solutions in a world of forced software evolution [6]. Ex: document-centric web tools, wristwatches, printers.

I've gotten better at the first one, but the next three all need work. The 3rd and 4th are particularly tricky. My heavy use of Google's multi-calendar sync solutions is clearly premature [5], but it's been very valuable and relatively bug free. On the other hand, I think my jury-rigged Contact integration solution may be a bridge too far. Then again, I stuck with Outlook's Notes feature long after it was clearly dead.

Cleaning up is the least interesting measure, but one of the most important. There are 1,575 entries in my credentials database, extending back to August 1995. Sure, most of those sites are long gone, but I still have too many active identities and credentials. I need to gradually cull a few hundred.
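Culling is dull enough that I'd script it. A rough sketch -- the file name and column layout are assumptions about how a credentials database might export, not how mine actually does:

    import csv
    from datetime import date

    CUTOFF = date(2005, 1, 1)  # assumption: untouched since then = cull candidate

    def cull_candidates(path="credentials.csv"):
        # Assumed export columns: site, last_used (YYYY-MM-DD).
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if date.fromisoformat(row["last_used"]) < CUTOFF:
                    yield row["site"]

    for site in cull_candidates():
        print("cull candidate:", site)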

This project should keep me busy for a while. It will, of course, suck processors in the short term, but I expect near term returns and long term gains. Feels like a good resolution target.

-- fn --

[1] It helps that recent research suggests that amyloid clearance occurs primarily during sleep, and I'm speculating that a 10-20% decline in amyloid clearance translates to 10 extra years of dementia.
[2] The wiki and, in the absence of alternatives, the document store. Don't touch the rest, even the stuff that seems to work is poison. 
[3] At my stage "GTD" is child's play.  I use a mongrel of Agile development planning methodology, GTD/Franklin, and a pocketful of tricks including calendar integration across family and work.
[4] For quite some time mind can grow even as brain more or less sucks wind. Not forever, but for a time. 
[5] The UI for configuring multiple calendars has been bizarrely obscure for about two years. This is not mainstream.
[6] It's predator-prey stuff. Software evolution was much more leisurely before human-on-human predation took off with hacks, frauds, identity theft, malware and the like. Now old bucks have to keep moving, or we become wolf chow. Software cycles are faster, products die quickly, and we have to keep buying whizzier hardware. If not for malware, the curated world of iOS would still be years away.


Sunday, December 05, 2010

Why you will live in an iOS world

Five years ago, just before Microsoft Vista was released, our household CIO made a strategic decision. We would move to OS X.

It wasn't a hard decision. The cost of supporting both XP and OS X was too high, XP's security, debugging and maintenance issues were intractable, and OS X had a much more interesting software marketplace. Moving to OS X would dramatically reduce our cost of ownership, which was primarily the CIO's opportunity cost. Time spent managing XP meant less time spent on my health and on family joys and obligations. [6]

It worked beautifully. One of my best strategic decisions. Yes, I curse Apple with the best of them, but I know the alternatives. I'm not going anywhere.

Except I am going somewhere. I will fade. So will you, though there's a bit more hope for the under-30 crowd. We might be able to slow the natural deterioration of the human brain (aka "Alzheimer's" and its relatives [4]) by 2030. It's too late for the boomers though, and probably too late for Gen X.

Sure, I'm still the silverback of the geek tribe. I may have lost a step, but between experience and Google I still crush the tough ones with a single blow.

Not for long though. I give myself ten years at most. I won't be able to manage something like OS X version 20, and I don't want to be reliant on my geek inheritor - son #2.

We will need to simplify. In particular, we'll need to simplify our tech infrastructure (and our finances [1] and online identities [7] too).

So our next migration will be to iOS - a closed, curated, hard target, simpler world.

You'll be going there too -- even if you're not fading (yet). The weight of the Boomers [2] will shift the market to Apple's iOS and its emerging equivalents. Equivalents like ChromeOS, now turning into an iOS for desktop devices with its own App Store [5].

I still have a few years of OS X left, including, if all goes well, the 11" MacBook Air I've been studying. The household CIO's job, however, is to think strategically. Our future household acquisitions will shift more and more to iOS devices, possibly starting with iPad 2.0 (2011) [3].

I expect by 2018 we'll be living in largely iOS-equivalent world, and so will you.

-- footnotes

[1] I miss Quicken 1996 -- before Intuit went to the DarkSeid.
[2] The 2016 remake of Logan's Run will be a smash hit. 
[3] I bought iPad 1.0 for my 80yo mother -- same reasons.
[4] 1989 was when the National Institutes of Health needed to launch a "Manhattan Project" style dementia-management program. I wasn't the only person to say this at the time. 
[5] If their first netbook device doesn't come in under $150 with batteries Google is in deep trouble. Android is not an iOS-equivalent, it's a lot more like XP. 
[6] Pogue's 10 year tech retrospective is a beautiful summary of the costs of making the wrong household tech decisions. He misses the key point though. The real costs are not the purchase costs, or the immense amount of failed invention, or the landfill costs -- it's the opportunity costs of all the time lost to tech churn. I've a hunch this opportunity cost is important to understanding what happened to the world economy between 1994 and 2010. That's another post though!
[7] Digital identities proliferate like weeds. Do you know where all your identities are?

Wednesday, October 06, 2010

Cricket’s $149 Android and the future $4000 Dell desktop

We are way past the tipping point if the no contract $149 Android phone is real [1]. The replacement for the $150 ChromeOS Netbook has come before the netbook, and Google’s $80 ultra-portable (with FM radio and a cell phone too!) is a year ahead of schedule – though Microsoft’s lawsuits will slow things down.

After the lawsuits settle down the contract free low end iPhone will go for $250 in 2012 and Android will hit a billion users by 2013 (including China’s forked Android phone). By then RIM, Windows Mobile and so on will be history. Nokia and Motorola will make Android phones. Microsoft will be an IP parasite, a shadow of its former self.

So what about Dell?

Here’s where it gets funny. I’m used to thinking Dell will go away. After all, even today’s phones can have external monitors and keyboards. Who needs a Dell after 2012?

Well, verticals will. Software development. Servers.

Thing is, vertical gear doesn’t sell for $800 a pop. Remember what Sun workstations cost when Sun was profitable? Desktop prices are going to start going up, and up. By 2013 I expect Dell will sell far fewer machines – but they’ll be much more expensive. One day we will see the $4000 desktop, even as much of Africa carries a supercomputer in its pocket.

[1] But what will it cost after the patent suits?

Tuesday, September 21, 2010

How I know Google's Blogger is dying

Nobody but me screams about how bad the new text editor is.

Try this experiment with Safari/Mac and the editor:
  1. Write a post in the rich text editor with paragraphs.
  2. Copy from the HTML view.
  3. Paste into a different post HTML view.
  4. View in Compose (rich text). Note the absence of paragraphs.
That's just the tip of the iceberg.

So what do I do with this blog?

Monday, September 13, 2010

Technological regressions: two examples

Two examples of technological regressions.
  1. Typing. I'm filling out hockey forms. By printing with a pen. Once upon a time I might have typed them. I was a fast typist.
  2. Reliable phone calls. Switched circuit calling was inefficient, but the quality was excellent. Now we have layers of VOIP everywhere -- and it's nowhere near as good as switched circuit. When you add mobile delays to VOIP home phones to VOIP teleconferencing systems you get the voice quality of 1940s long distance (a rough latency sketch follows this list).
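To see why the stacked-VOIP case sounds so bad, here's a rough one-way latency budget. Every per-hop number is an illustrative assumption, not a measurement; the 150 ms guideline for acceptable one-way delay is the ITU-T G.114 figure.

    ACCEPTABLE_ONE_WAY_MS = 150  # ITU-T G.114 guideline for one-way delay

    # Illustrative, assumed per-hop one-way delays in milliseconds.
    path = {
        "cellular air interface": 80,
        "codec + packetization (two VOIP legs)": 60,
        "jitter buffers (two VOIP legs)": 120,
        "internet transit": 60,
        "conference bridge mixing": 40,
    }

    total = sum(path.values())
    print(f"~{total} ms one-way vs. {ACCEPTABLE_ONE_WAY_MS} ms guideline")
    # ~360 ms one-way: well past the point where people start talking over each other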
I'm sure there are others ...