The Gilder Fallacy

Rich Karlgaard at Forbes has a recurring meme he calls the Cheap Revolution to explain the impact of increasing technological abundance.  He revisits it in the latest issue of Forbes:


“The Cheap Revolution rolls on, making new billionaires even as it collects more old scalps each year. Step back and look at what’s happening in technology. Computation gets twice as fast every 18 months, and at the same price point. Storage evolves faster–every 12 months. Communications is fastest of all, doubling every 9 months.”


He is repeating an assumption that underlies an awful lot of conventional wisdom (and investment dollars) today: the idea that network bandwidth is growing relatively faster than computation or storage.  I give George Gilder credit for popularizing this notion in his Microcosm and Telecosm books and various articles.  The only problem is that he was wrong when it comes to the real world.  So, giving credit where credit is due, we call this the Gilder Fallacy.

Yes, in terms of relative increase in technical capability, communications are outstripping computation and storage.  George can explain in florid prose how ever more photons can be crammed down a strand of glass.  But unlike computation and storage, the customer’s cost of communications over public networks doesn’t mirror the rate of improvement in the lab; in fact it badly lags the improvement of computation and storage.  The technology improvements don’t get passed on in wide area networks (i.e., the Internet).  Just look at your broadband bill – my guess is it doesn’t halve every nine months, and your bandwidth probably doesn’t double every nine months either.  Some combination of telco pricing practices, municipal tax policies, and last-mile issues ensures the savings are seldom passed on to you and me, and certainly not at any rate approaching Moore’s Law.  You have to build your own network to ride the underlying improvement curve.
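To see how much those different doubling times matter in practice, here is a back-of-the-envelope sketch in Python.  The 18-, 12-, and 9-month doubling times come straight from the Karlgaard quote above; the 36-month doubling time for delivered wide-area bandwidth per dollar is purely an illustrative assumption of mine, not a measured figure.

# Back-of-the-envelope: how far apart do these improvement curves drift over a decade?
# The first three doubling times are from the Karlgaard quote; the last one is a
# hypothetical placeholder for what actually reaches customers, for illustration only.

def improvement(doubling_time_months, horizon_months):
    """Multiplicative improvement over the horizon, given a doubling time."""
    return 2 ** (horizon_months / doubling_time_months)

HORIZON_MONTHS = 120  # ten years

curves = {
    "computation (18-month doubling)": 18,
    "storage (12-month doubling)": 12,
    "optics in the lab (9-month doubling)": 9,
    "delivered WAN bandwidth per dollar (assumed 36-month doubling)": 36,
}

for name, months in curves.items():
    print(f"{name}: ~{improvement(months, HORIZON_MONTHS):,.0f}x in ten years")

Whatever the exact placeholder, the point is the gap: the lab curve for optics and the curve on your broadband bill compound at very different rates, and over a decade the difference is measured in orders of magnitude.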


Getting the underlying economics right provides an incredible tailwind.  People who grokked Moore’s Law two decades ago ran their businesses better than those who didn’t; it helped them understand what was important and what were only temporary obstacles.  We’re moving from a world of scarcity to one of abundance in many areas, but you still need to understand relative abundance.  The Gilder Fallacy leads you to believe computation is the relatively scarce resource, so you waste bandwidth to optimize for computation.  But the underlying economics dictate the opposite.  The fallacy was refuted years ago, yet I am amazed at how many bets in the industry continue to suffer from it.


Jim Gray has a brilliant paper on Distributed Computing Economics that delves into the implications – in particular:


“Put the computation near the data. The recurrent theme of this analysis is that “On Demand” computing is only economical for very cpu-intensive (100,000 instructions per byte or a cpu-day-per gigabyte of network traffic) applications.”


“If telecom prices drop faster than Moore’s law, the analysis fails.  If telecom prices drop slower than Moore’s law, the analysis becomes stronger.   Most of the argument in this paper pivots on the relatively high price of telecommunications.  Over the last 40 years telecom prices have fallen much more slowly than any other information technology.  If this situation changed, it could completely alter the arguments here.   But there is no obvious sign of that occurring.”
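As a rough sanity check on how the two thresholds in the first quote relate, here is a small Python calculation.  The assumption of a processor retiring roughly a billion instructions per second is mine, for illustration; it is not a figure taken from Gray’s paper.

# Rough consistency check of the break-even thresholds quoted above.
# Assumption (mine, not from the paper): a processor retiring ~1e9 instructions per second.

INSTRUCTIONS_PER_SECOND = 1e9   # assumed ~1 GIPS processor
SECONDS_PER_DAY = 86_400
BYTES_PER_GIGABYTE = 1e9

cpu_day_instructions = INSTRUCTIONS_PER_SECOND * SECONDS_PER_DAY
instructions_per_byte = cpu_day_instructions / BYTES_PER_GIGABYTE

print(f"One CPU-day is roughly {cpu_day_instructions:.1e} instructions")
print(f"Spread over one gigabyte, that is about {instructions_per_byte:,.0f} instructions per byte")

Under that assumption, a CPU-day per gigabyte and 100,000 instructions per byte land in the same ballpark, which is the point: unless an application does an enormous amount of computation for every byte it moves over the network, it is cheaper to run the computation where the data already lives.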


Again, this analysis applies to public networks.  Build your own network and you can come much closer to harnessing the underlying rate of technology improvement.  That fact will also help perpetuate a distinction between what you might do on a corporate network and what you might do over the Internet.


This has big implications for grid computing, On-Demand, “software as a service,” and various other industry enthusiasms.  Many of them set sail thinking they had a tailwind when in fact the headwind will only grow fiercer over time.


Not only is the power on the edge, but the edge is getting more powerful on a relative basis.

3 thoughts on “The Gilder Fallacy”

  1. Every now and then Gilder is just massively full of it. It was probably about 9 years ago that I heard him give an interview explaining how, before too long, there wouldn’t be these big fat applications anymore; everything would be a Java applet. Yes, he actually said this, and I didn’t misunderstand him. Once you get past the basic physical technologies he’s clueless.
