Utility Computing or Futility Computing?

Perhaps the network is not in fact the computer.  The Register reports that after 14 months Sun still doesn’t have any customers for their $1/CPU/hour computing service.
It is easy to pick on Sun, but they’ve made the same fundamental error as everyone touting utility, grid, on-demand, and other flavors of buzzword computing: the economics just don’t work in the wide area.  Sun seems to get the economics wrong more often than most (e.g. buying a tape company just as disk became cheaper than tape), but they all suffer from the mainframe conceit that it is better to centralize processing even if that means the processing ends up far away from the task at hand.  That was once a good idea, but not anymore.  Timesharing has lost some of its pizazz.  Processing is relatively abundant (see Jim Gray’s Distributed Computing Economics paper again).  You’d better be doing a lot of processing to make it worth the round trip.  This is why it makes sense to interact with remote, high-value services rather than shipping things out to be processed elsewhere.  The software behind the service actually does something useful and is close to its own data.  Meanwhile, there is more and more power at the edge of the network to consume and remix those services.
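The round-trip arithmetic can be sketched in a few lines. The wide-area transfer price below is an assumed round number for illustration (not Gray’s exact figure), and `breakeven_cpu_hours` is a hypothetical helper; the compute price is Sun’s advertised $1/CPU/hour:

```python
# Back-of-envelope check on the round-trip economics, in the spirit of
# Jim Gray's Distributed Computing Economics paper.

WAN_COST_PER_GB = 1.00     # assumed wide-area transfer cost per GB (illustrative)
CPU_COST_PER_HOUR = 1.00   # Sun's advertised $1/CPU/hour

def breakeven_cpu_hours(gb_shipped: float) -> float:
    """CPU-hours of remote work a job must consume before the compute
    charge even matches the cost of shipping its data over the WAN."""
    return gb_shipped * WAN_COST_PER_GB / CPU_COST_PER_HOUR

# At these prices, shipping 10 GB out costs as much as 10 CPU-hours of
# remote work; a job that computes for less than that is cheaper to run
# near its own data.
```

The exact prices matter less than the ratio: unless a job burns far more in compute than it costs to move its data, centralizing the processing loses.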
Sun says, "In the long run, all computing will be done this way."  Keynes says that in the long run we’re all dead.  Depending on your choice of strategy, the long run may come to pass sooner rather than later.