HTML and the Innovator's Dilemma
July 29, 2007
Depending on your point of view, the following prediction will either seem incredibly obvious or incredibly stupid. Oh well. Here it is:
HTML with Javascript is going to become the GUI technology of choice, killing off "rich client" and desktop apps written in languages such as C, C++, Java and C#.
Okay, before you click away in disgust at my obviously stupid (or stupidly obvious) statement, let me back it up. This isn't a result of limited experience or some personal agenda. Some of my biggest customers are creating rich client applications. As Monty Python would say, "Not dead yet!" Furthermore, HTML+Javascript is an expensive technology, plagued with cross-browser issues and unable to provide the responsiveness and interactivity of a desktop application.
But the writing's on the wall, and the collapse of desktop apps could take vendors that specialize in them down too. Here's why.
Disruptive Technologies
In 1997, Harvard Business School Press published a book by Clayton Christensen titled The Innovator's Dilemma. It detailed the inevitable march of disruptive technologies and their role in causing well-managed companies to fail.
From the name, you might think that disruptive technologies make a big splash, stunning everybody with their brilliance and allowing nimble, smaller companies to run rings around the lumbering old guard. Actually, that's not true at all. Big, well-managed companies have smart engineers and they're completely capable of taking advantage of innovations. In fact, they often create the disruptive technologies. Frequently, they're also driven out of business by them.
You see, a key element of disruptive technologies is that they initially provide worse product performance than existing technologies. In other words, to rational observers already established with existing technologies, the disruptive technology is, well, crap. It does less and costs more.
Of course, if the technology is so bad, why would anybody buy it? Actually, no one does... at least, not in the existing markets. After all, it's crap. Established companies ignore it. It's the small, starving startups that take the technology to market... and not any existing market, because the performance is so bad. They actually create new markets for the technology, using it to provide something no previous technology can provide. Then they just percolate along, doing their own thing, largely ignored by the big guys.
For an example, look at hard disk drives, which have gone through many generations of disruptive change. It looks like they're currently going through another. Five years ago, solid-state drives (flash memory) were outrageously expensive compared to hard disk drives. They both provide non-volatile storage, but there was no way flash memory could compete in the desktop market. The makers of flash memory had to find a new market, and digital cameras ended up being a perfect fit. (Does anybody remember when digital cameras used magnetic tape, floppy disks, and CD-ROMs? They did!)
Sound familiar yet?
Five years ago, HTML+Javascript was outrageously expensive compared to desktop apps. They both provide GUI interfaces, but there was no way HTML+Javascript could compete in the desktop market. Companies using this technology had to find a new market, and web applications ended up being a perfect fit. (Does anybody remember when web applications were entirely server-side, with no client-side interactivity? They were!)
The March of Progress
Okay, this is all fine. New technologies come along and create new markets. Sure--that makes sense. What makes them disruptive? Why would this cause existing companies to fail?
The other key element of disruptive technologies is that their technological progress outstrips market demand. In other words, the technology gets better faster than the market's needs. Furthermore, the disruptive technology offers something that the existing technology cannot, such as simplicity, reliability, or convenience.
As existing and disruptive technologies both improve, the existing technology outpaces its market's needs, while the disruptive technology begins to satisfy them. At first, the disruptive technology takes over the low end of the market. The established technology flees to the higher-margin, higher-profit high-end... sometimes gladly. This is often a time of record profits for established companies. However, as technology continues to improve, the disruptive technology takes over more and more of the market, squeezing out the established technology and eventually squashing it entirely.
Let's look at how this happened with 3.5-inch hard drives*. These drives are ubiquitous now, but at one time, they were a disruptive technology. They were first developed in 1984 by Rodime, but they had few sales. Seagate was a dominant manufacturer, and their engineers developed a 3.5-inch drive in 1985. At the time, new desktop computers used 5.25-inch drives in 40- and 60-megabyte capacities. Seagate's 3.5-inch drive was limited to 20 megabytes and cost more than the 5.25-inch drives. Seagate's customers didn't like them, and Seagate executives cancelled the program.
*This anecdote is adapted from Christensen.
Two years later, in 1987, upstart Conner started selling 3.5-inch drives to Compaq for use in portable computers, a new market that 5.25-inch drives couldn't satisfy. By 1988, the drives were good enough to use in desktop computers. By 1989, more than half of all hard drive sales were 3.5-inch drives. Today, 5.25-inch drives are nowhere to be found on the desktop. They've been replaced with 3.5-inch drives.
In retrospect, the transition from 5.25-inch drives to 3.5-inch drives may seem obvious and necessary. But think about it from the perspective of the time--why did a desktop tower, with plenty of space, need a smaller, more expensive drive? It didn't. With a higher cost per megabyte, there was no reason to develop the technology. It took a new company in a new market--Conner, selling to the portable computer market--to nurture the technology. Eventually the drives reached capacities that were attractive to desktop computer users, and their greater reliability provided a reason to switch.
Why Disruptive Technologies Win
Although disruptive technologies start out providing inferior performance, they gradually improve in capability. Over time, they grow to provide the performance that the mainstream market desires--first at the low end, but eventually at the high end as well.
Why would anybody switch from the old technology to the new one? Once the disruptive technology has matured, it not only satisfies the market's needs, it does so in a way that's simpler, cheaper, more reliable, or more convenient than the existing technology.
As a result, once the disruptive technology reaches a certain level of performance, the changeover can be swift and dramatic. Three and a half-inch drives were invented in 1984. The desktop market ignored them until 1988... and then, just one year later, more 3.5-inch drives were sold than 5.25-inch drives.
Although Seagate continues to survive, many established companies fail when disruptive technologies take over their market. The reasons are difficult to summarize--read Christensen for details--but these failures are essentially the result of companies being trapped by the expectations of their existing customers, suppliers, and employees. Even when executives recognize the threat of a disruptive technology, their company is at the center of a complex web of skills and relationships that they can rarely escape. Rather than embrace the disruptive technology, they flee upmarket, only to be pursued by relentlessly improving technology.
HTML+Javascript is a Disruptive Technology...
Could HTML with Javascript be a disruptive technology? I think it is. To be disruptive, a technology has to fit several criteria. First, at the time of its introduction, it needs to be such a poor performer that it isn't suitable for the mainstream market, but not so poor that it dies entirely. Instead, it must find a brand-new market. HTML+Javascript fits this profile perfectly. As a GUI technology, it couldn't compete with mainstream GUI applications, but it had a solid home in the web.
Second, its capabilities must improve faster than the demands of the market. In the GUI application market, that's not actually that hard. The mainstream market for GUI applications (excluding games) barely changes at all. We're still using the same basic WIMP (windows, icons, menus, and pointer) interface that we always have. Despite our computers' vast improvements in terms of sound and 3D capability, only a few specialized applications use those capabilities.
HTML+Javascript, on the other hand, is rapidly becoming more capable. The server round-trip delay that once followed nearly every click is no longer a given; applications now provide truly instant responses. Google Maps even has real-time drag-and-drop route planning, a feat I wouldn't have thought possible at HTML+Javascript's level of sophistication. HTML+Javascript's capabilities are clearly improving faster than the market's demands.
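To make that concrete, here's a minimal sketch of the asynchronous request pattern--popularly called "Ajax"--that eliminates the full-page round trip. The "/search-suggestions" URL and "results" element are my own hypothetical examples, not taken from any particular application:

    // A hypothetical example: fetch data from the server when the user
    // acts, then update the page in place--no reload, no flicker.
    function fetchSuggestions(query) {
        // IE6 needs new ActiveXObject("Microsoft.XMLHTTP") instead.
        var request = new XMLHttpRequest();
        request.open("GET", "/search-suggestions?q=" + encodeURIComponent(query), true);  // true = asynchronous
        request.onreadystatechange = function() {
            if (request.readyState === 4 && request.status === 200) {
                // Replace just the part of the page that changed.
                document.getElementById("results").innerHTML = request.responseText;
            }
        };
        request.send(null);
    }

This kind of in-place update is what lets applications like Google Maps respond instantly, instead of waiting for the server to regenerate the whole page.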
Third, like other disruptive technologies, HTML+Javascript is cheaper and more convenient (for the user) than established technologies. Most HTML+Javascript applications are advertiser-supported or available for a small subscription fee. They provide convenience in the form of zero installation and maintenance and they can be accessed from any computer.
...And It's Already Won
We're still at the early stages of HTML+Javascript's disruption of the mainstream GUI market. The technology is still focused on niche applications that typically aren't found on the desktop, such as Google's map and driving directions. It has started to encroach on the desktop market, though, with its most notable success being email applications. We're also starting to see the first tentative steps towards HTML+Javascript productivity applications, such as Google Docs & Spreadsheets.
The writing is on the wall. Existing GUI technologies have outstripped market demands and HTML+Javascript is rapidly reaching the technological capability required to serve that market. With its added conveniences of location transparency and zero installation, the market will surely transition to HTML+Javascript.
Although it's just barely competing in the mainstream GUI market and has a long way to go, the crucial moment for HTML+Javascript has already passed. Sure, HTML+Javascript has notable flaws, but those flaws will be addressed. Developers and users will demand it. HTML with Javascript has won the fight. Now we watch while the pattern of disruptive technologies plays out to its inevitable conclusion.
Further Reading
- Don Dodge of Microsoft's Emerging Business Team says, "No Innovator's Dilemma here."
PS...
Isn't the Internet or the Web the disruptive technology? For things like music and movie delivery, I think they are. For GUI software developers, I think the true disruptive technology is HTML+Javascript. The Internet and the Web are a network and a protocol, respectively--sustaining, not disruptive, improvements to technologies that GUI developers already used. Take a look at Microsoft: they have no trouble providing developer support for internetworking and web servers, but they're having trouble dealing with the disruption of HTML+Javascript. If you want a rich client-side development library for HTML+Javascript, you'll turn to upstarts Google and Yahoo, not Microsoft.
PPS...
I'm smacking myself on the forehead right now. I completely forgot to mention the huge category of applications that has been transitioning to HTML+Javascript for years: internal and vertical-market apps. HTML+Javascript hasn't just won in theory, it's kind of a done deal for many corporate apps.