Open Source and the Commoditization of Software

Ian Murdock

It is said that the only things certain in life are death and taxes. For those of us in the IT industry, we can add one more to the list: commoditization. The question is, how do we deal with it, particularly if we are IT vendors and not simply IT consumers, for whom commoditization is an unquestionably positive event?

Commoditization is something that happens to every successful industry eventually—success attracts attention, and there are always competitors willing to offer lower prices to compensate for lesser-known brands or "good enough" quality, as well as customers to whom price means more than brand, quality, or anything else the high-end providers have to offer.

Often, to remain competitive at lower price points, the low-end provider employs a strategy of imitation—for example, investing less in research and development than its high-end peers, and instead relying on the high-end providers to "fight it out" and establish standards and best practices it can then imitate in its own products.

This strategy works because success also breeds interoperability. Unless a company monopolizes a market (a temporary condition, given today's antitrust laws), an industry eventually coalesces around a series of de facto standards that govern how competing products work with each other, or how consumers interact with like products from different vendors. In other words, given time and a large enough market, every industry naturally develops its own lingua franca.

This kind of natural standardization is good for consumers and for the world as a whole. Few people, for example, would know how to type if every typewriter used a different layout for its keys, and the telephone wouldn't be in widespread use today if each carrier's network couldn't talk to any of its competitors' networks. And where would we be today without the descendants of typewriters and telephones—namely, computer keyboards and telecommunications?

Of course, from any incumbent's point of view, an ideal world would allow, say, the market leader in typewriters to own the layout of its product's keys, so anyone who learned to type using its product would face huge barriers to switching to a competitor's product. Fortunately, the layout of a typewriter's keys and similar interoperability features are very difficult proprietary positions to enforce, so once a standard way of interoperating emerges, all vendors are free to imitate that standard in their own products.

The moral of the story is that standardization, and thus commoditization, are both natural market forces and key events in human history. When an industry matures and competing products become more or less interchangeable commodities, new industries can build atop them to create new and innovative products that would not have been possible had the underlying industries not standardized. In the case of typewriters and telephones, it is clear that the industries they enabled—the computer industry, e-commerce, etc.—greatly exceed the size of the industries that enabled them, both economically and in their contribution to human progress.

So, how do incumbent firms fight commoditization? Another moral of the story is that they shouldn't. The forces of commoditization, being natural market forces, cannot be beaten. Yet time and time again, incumbent firms fight them. First, the challengers are ignored or dismissed as cheap knockoffs, unsuitable for any but the least-demanding customer. Then they are ridiculed for lacking imagination and innovation. Then, invariably, they are imitated—but by this point, it is too late, as the market has fundamentally changed, and the incumbent finds itself unable to compete because the challengers were built for a commodity market and the incumbent was not. In very simple terms, this is Clayton Christensen's Innovator's Dilemma at work.

This chapter argues that the open source movement is just another commoditization event and that, like other commoditization events, it represents a disruptive shift in the software industry as well as an opportunity for entrant firms to unseat the established firms against seemingly overwhelming odds. That said, commoditization does not spell certain death for the established firms if they have the vision to see beyond the disruptive events that may befall them in the short term and can adapt themselves to the new commodity environment. Above all, this chapter aims to convey that commoditization is a natural and unstoppable force that is good for everyone involved—if that force is allowed to run its natural course.

Commoditization and the IT Industry

The computer industry managed to escape the forces of commoditization for the first 20 years or so of its life—a natural occurrence given the industry was young enough and small enough that standards had not yet had the opportunity to emerge. In the first two decades of the industry, computer manufacturers delivered an end-to-end solution to the customer, from the hardware on up through the operating system software that ran the hardware to the applications that ran on top of the operating system. Every layer of the stack—and, most importantly, the interfaces between them—was proprietary to the computer vendor. As a result, every computer spoke a different "language," and it was difficult to get different types of computers to "talk to each other" and interoperate.

Because of these incompatibilities, the initial choice of hardware implicitly tied the buyer to an operating system; in turn, the operating system dictated what applications the buyer would be able to use. Over time, the high cost of computing technology made it financially impractical for the buyer to move away from the incumbent vendor because previous investments in that vendor's technology would have to be discarded. The combination caused users to become "locked in" to a single vendor.

However, as the industry matured, the dynamic changed. Entrant firms such as Apple, Apollo, and Sun saw the opportunity to create products that targeted an entirely new class of computing consumer—the individual user—who could not afford the mainframes and minicomputers sold by established firms such as IBM, DEC, and Data General.

By focusing on "good-enough" quality and lower prices, and by tapping into years of consumer frustration caused by batch processing, timesharing, and incompatibility between proprietary stacks, the new "personal computing" products were received enthusiastically and began to appear in offices and dens everywhere.

The strategies employed by one entrant firm and one established firm would forever change the computer industry. Ironically, the established firm's strategy would lead directly to the commoditization of the hardware industry, while the entrant's would lead directly to the ongoing commoditization of the software industry.

On the hardware side, IBM sought to stem the rising tide of Apple by introducing its own personal computing product, the IBM PC. Because of internal cost structures designed around multimillion-dollar mainframe products as well as an aggressive product launch timeline, IBM decided to use off-the-shelf parts for the IBM PC instead of following its traditional approach of developing proprietary components in house.

On the software side, Sun sought to attain a competitive advantage against the proprietary stacks of the mainframe and minicomputer vendors by basing its workstation products on the Unix operating system. Unix was already hugely popular in academia and corporate research labs, so this approach gave Sun instant access to a large portfolio of compatible applications as well as an enormous user base already familiar with the operating system that shipped on its products.

In other words, Unix was an open system—that is, a system based on open standards. Unix variants from different groups (for example, AT&T Unix and BSD Unix, the two variants in widespread use in the early 1980s) were largely based on the same APIs. Because of this, applications could be easily ported from one version of Unix to another, and users familiar with one version of Unix could easily learn to operate a different version.
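To make that compatibility concrete, here is a minimal sketch in C, using the modern POSIX names for the interfaces (the file being read is arbitrary, chosen purely for illustration). Because it relies only on the classic Unix file I/O calls—open, read, write, close—essentially the same source could be compiled unchanged on an AT&T-derived or a BSD-derived system:

    #include <fcntl.h>
    #include <unistd.h>

    /* Copy a file to standard output using only the classic Unix
       system calls, which behaved the same across Unix variants. */
    int main(void)
    {
        char buf[4096];
        ssize_t n;
        int fd = open("/etc/motd", O_RDONLY);  /* arbitrary example file */
        if (fd < 0)
            return 1;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            write(STDOUT_FILENO, buf, (size_t)n);
        close(fd);
        return 0;
    }

Portability of this kind is precisely what gave Sun instant access to the existing portfolio of Unix applications.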

Decommoditization: The Failure of Open Systems

The impact of Sun's decision was the first to be felt. Open systems quickly became popular because of the compatibility they offered—a completely foreign notion at the time. Users adopted systems based on open standards because doing so allowed them to move freely among products from different vendors, avoiding the lock-in common in the proprietary world. Soon, numerous companies—including some of the mainframe and minicomputer vendors—launched Unix-based workstations to compete with Sun's, and Unix became big business.

As the Unix market grew, the competition for customers became fierce. Compatibility among products, which had helped the Unix vendors win converts from the proprietary world, changed from an asset into a liability. In an attempt to imitate the lock-in strategies that had served the mainframe vendors so well for so many years, the Unix vendors themselves began adding incompatible features to their respective products. This ultimately fragmented the market and alienated customers. By the late 1980s, Unix was no longer a lingua franca for the workstation market, but a veritable Tower of Babel.

Meanwhile, IBM's decision to use off-the-shelf parts in the IBM PC inadvertently created the industry's first open hardware platform. It was not long before a new wave of entrants, such as Compaq, Dell, and Gateway, realized they could build products that were 100% compatible with the IBM PC, thus gaining access to a large base of applications and users, much as Sun had done by adopting Unix. On the component side, two companies experienced the biggest windfall from IBM's decision: Intel and Microsoft. As the clone market emerged, both companies found an entire market to sell to, not just a single company—a much larger opportunity, even if that single company was IBM.

At this point, the events set in motion by IBM and Sun intersected. As the Unix vendors were competing vigorously with each other through the introduction of proprietary extensions to Unix, thereby "decommoditizing" the lowest level of the software stack, the fully commoditized PC waited in the wings. As PCs became more powerful, they began to replace workstations, and as PCs continued their march upmarket, the market power of the PC vendors (and, thus, the vendors of their constituent components) increased dramatically. In particular, the new ubiquity of the PC helped Microsoft's Windows operating system replace Unix as the lingua franca of not just the new PC-based workstation market, but also of the entire computer industry.

Why did Unix fail while the PC succeeded beyond anyone's wildest expectations, particularly those of its progenitor, IBM? Both began life as open systems—as ecosystems of sorts—and both grew enormously popular because of their open nature. On the Unix side, though, each vendor tried to own the ecosystem by itself, and, in the end, all they collectively managed to do was destroy it. Meanwhile, on the PC side, the ecosystem won out, to the betterment of all who embraced it; and, most importantly, the existence of that ecosystem enabled the creation of other ecosystems above it. For example, without a truly open platform in every office and den, the Internet would not have been able to take root; it too evolved into an ecosystem that has spawned countless products, services, industries, and ecosystems that were previously unimaginable.

Linux: A Response from the Trenches

It was into this environment that Linux emerged in the early 1990s. At first the mere hobby project of a young college student, Linux captured the imagination of those who could best be described as the "collateral damage" of the Unix wars. Two features made Linux appealing to this large group of users and developers: its compatibility with Unix, with which they were intimately familiar; and its license, the GNU General Public License (GPL), which not only allowed the scores of Unix refugees to contribute to its development, but also guaranteed that Unix-style fragmentation could never happen to the result of the community's work, at least at the source-code level.

Linux grew by leaps and bounds during the 1990s. As with previous challengers, it was first ignored, then ridiculed, by the incumbents, primarily Microsoft, which had masterfully used its position as the de facto standard operating system to expand into numerous additional markets and gain additional—even unprecedented—market power. Unlike so many companies that had come before it, Microsoft wielded the forces of commoditization expertly. By offering its products at lower prices than its competitors could afford to offer them, Microsoft preemptively commoditized many of the markets in which it competed, depending on high volume to make its products profitable and making it impossible for challengers to undercut it.

As Microsoft's power grew, so did the desire of Microsoft's competitors to counter it. By the late 1990s, it was clear that Linux was a powerful force, and many of the industry's largest companies began to see it as a competitive weapon. These companies also recognized that the power behind Linux wasn't so much its technology as its licensing and development model, by now referred to as "open source"—and in particular, the open source model's ability to "out-commoditize" Microsoft.

The fundamental question is this: why is Linux (and the open source movement it helped launch) able to out-commoditize Microsoft? Because Linux, like the PC, the Internet, and the other open systems and open standards we take for granted today, is more of an ecosystem than a technology. Indeed, Linux builds atop those previous ecosystems—without open, commoditized hardware, and without the Internet to enable the open source development model to work, Linux would not exist today.

Microsoft may wield the forces of commoditization more expertly than any company that has come before it, but its platform is not an ecosystem. By definition, an ecosystem is an environment to be shared, not owned. Linux is positioned to become the lingua franca of the lowest level of the software stack, if we never forget it is an ecosystem and not a product to be owned. Looking at the lessons of the past, if it remains an ecosystem, we all win. If not, we destroy it.

"So, How Do You Make Money from Free Software?"

If the open source movement represents the commoditization of software, how can the challengers of today's software industry utilize its commoditizing power to unseat the incumbents, Microsoft in particular? Perhaps more importantly, if this strategy succeeds, is there money to be made in a software industry that has been commoditized? Finally, are there lessons that can be applied from past commoditization events, particularly the events that reshaped the hardware industry in the 1980s?

For a textbook example of how to turn the commoditization of an industry into business advantage, one need look no further than Dell Computer. Dell, of course, was one of the companies that started life in the mid-1980s building IBM PC clones. Dell's initial claim to fame was "build to order," which took advantage of the fact that a PC was not really a product in itself but, rather, an assemblage of numerous products that any individual with a moderate amount of skill could put together himself—a direct lineage from IBM's decision to base the original IBM PC on off-the-shelf parts.

Unlike some of its competitors, Dell saw itself for what it truly was: an assembler of off-the-shelf components and a distributor of those components in a form its customers found useful—namely, a complete PC. Dell gave its customers choice—not an overwhelming amount of choice, but enough to give those with the skill to build their own PCs reason to buy from Dell instead. Its competitors, on the other hand, attempted to mold the PC into a monolithic, unchangeable product, a collection of specific components from specific vendors with the occasional bit of proprietary technology added to the mix—a thinly veiled attempt to decommoditize the PC standard and own it outright.

To accommodate its new approach to selling hardware, Dell had to develop a new kind of business model. Over the years, the Dell model became more about the assembly of product than about the final product of that assembly process. Dell became remarkably good at assembling components from a multitude of suppliers into cohesive wholes, and at negotiating with those suppliers to get the lowest possible price. It stuck to commodity components, allowing the market to pick winning technologies and resisting the temptation to invest heavily in the R&D required to play the proprietary lock-in game its competitors were playing. It also employed unusual tactics on the sales side, most notably selling directly to the consumer instead of going through wholesalers and resellers, each of whom took a substantial slice of the profit margin.

As a result of its streamlined processes and lower cost structure, Dell was able to sell PCs at much lower prices than its competitors could. As the PC market grew, and as it commoditized further with each failed proprietary extension to the PC standard, Dell's position grew stronger. As the PC moved upmarket, it simply became less expensive to "outsource" the assembly of PCs to a supplier that specialized in assembling them, and Dell was extremely well positioned to play this new role. Today, as the commoditization of PCs extends to other parts of the hardware market—servers, storage, printers, handheld devices—Dell continues to be extremely well positioned, and its entry into a new market is often taken as impending doom for that market's established firms.

The First Business Models for Linux

So, what lessons can we learn from Dell as open source commoditizes the software world? First, a general one: operating in a commodity market calls for entirely different business models from those that preceded it. Beyond that, what specific lessons can be drawn from Dell's success? As a start, we will look at the lowest layer of the software stack, the operating system, and attempt to draw parallels between Dell's successful strategy and the strategies of today's open source operating system vendors—namely, the Linux distribution companies.

To millions of users around the world, "Linux" is an operating system. They're right, of course, but the reality is far more complex than that. First of all, Linux proper is just the kernel, or core, of the operating system—the rest of the software that comprises the "Linux operating system" is developed independently from the kernel, by different groups that often have different release schedules, motivations, and goals.[1]

Traditional operating systems are built by cohesive teams, carefully coordinated groups of product managers, project managers, and programmers at companies and universities. In contrast, Linux is built by thousands of individuals—hackers and hobbyists and professional programmers—some paid to work on specific projects but the majority simply working on what interests them. And the reality is even more involved: Linux is not just a single system, but hundreds of subsystems, programs, and applications, themselves developed by their own communities of individuals around the world.

So, who glues all this mishmash together into something that actually looks like an operating system? Almost since the inception of the Linux community, this has been the job of the "Linux distribution"—a curious term in itself for those coming from broader computing circles, accustomed as they are to operating systems built by cohesive teams, or at least by teams of cohesive teams.

A Linux distribution is a collection of software (typically free or open source software) combined with the Linux kernel to form a complete operating system. The first distributions (HJ Lu's boot/root diskettes, MCC Interim) were very small affairs, designed simply to help bootstrap the core of a Linux system, on which the user (typically a Linux hacker himself, eager to get into writing some code) could compile the rest of the system by hand and as needed.

A second generation emerged (SLS, Slackware, Debian) that aimed to expand the breadth and depth of software shipped by the first-generation distributions, including software typical end users of Unix systems might find useful, such as the X Window System and document formatting systems. In addition, the second-generation distributions attempted to be easier to install than the first, as they were targeted not at Linux hackers eager to get into writing code, but rather, at the ever-expanding collection of end users Linux was just beginning to attract at the time.

As Linux's user base grew, many in the Linux community began to sense a business opportunity, and the first Linux companies were formed: Red Hat, Caldera, SuSE, and many others whose names have long been forgotten. These companies formed around the concept of selling commercial distributions to the expanding Linux user base. A third generation of Linux distributions was born.

The commercial opportunity was ripe. Until then, the primary means of acquiring Linux had been the Internet, and the primary users of Linux had been students at universities, where Internet access was plentiful. In the broader population where Linux was beginning to get noticed, however, potential Linux users were lucky to have dial-up access to online systems such as CompuServe. This gap, combined with the rising popularity of CD-ROM drives and the growing size of distributions—which incorporated more and more software to appeal to an ever-wider audience—gave rise to the first business models for Linux.

These business models served the first Linux companies well through most of the 1990s and, indeed, this is where the term "Linux distribution" originated—the companies themselves were little more than assemblers and distributors of Linux software, including the Linux kernel, the GNU compiler toolchain, and the other software that came with a typical Linux system. As the typical Linux user became less and less of a technologist and more and more of a traditional end user, the focus of the distributions shifted from simple assembly and distribution to making the distributions easier to install and use.

Linux Commercialization at a Crossroads

Of course, as distributors of a commodity (for, after all, any company could easily become a Linux distributor—all the software being distributed was free), these new Linux distribution companies lacked the "proprietary advantage" every business needs to survive, not to mention thrive. So, following time-honored tradition, many of the Linux companies kept their "value add" proprietary in an attempt to better compete with each other.

For a time, one company took a different approach: Red Hat. After a brief flirtation with proprietary extensions, Red Hat announced that its products would include only open source software. Why? It listened to what the market was telling it. The scores of Unix refugees, now occupying important positions in the companies that were adopting Linux in droves, had already been down that path. Furthermore, the giants of the industry now supporting Linux—by this point, virtually all of the companies that had participated in Unix's destruction and seen the consequences—saw Linux as a commodity platform that could recapture the position they had collectively handed to Microsoft in the early 1990s. As a result, Red Hat emerged as the market-leading supplier of Linux software.

However, as the Linux market continued to grow, and as it began to take a place at the core of the computer industry, Red Hat bumped up against its own ceiling, caused by lack of proprietary advantage—other companies were beginning to take in billions of dollars per year in revenue from Linux-based sales, while Red Hat seemed to have hit its peak at $100 million or so.

To counter this, Red Hat came up with a strategy that was still in keeping with its "100% open source" market position. Instead of focusing on selling Linux as a boxed product, it would sell software updates to those boxed products in the form of annual subscriptions. This strategy by itself proved inadequate, as the software updates it distributed were available for free. So, it combined the new strategy with another maneuver, a redefinition of the "Linux platform" to one it could define and control itself.

Moving away from its traditional, freely redistributable Red Hat Linux product line, it launched Red Hat Enterprise Linux. One key part of the strategy behind Enterprise Linux was that independent software vendors (ISVs) and independent hardware vendors (IHVs) were directed to certify to this new "high-end" Linux platform, while the old Red Hat Linux was relegated to software developers and infrastructure roles. The other key part was that Enterprise Linux was no longer freely redistributable—acquisition of the product was tied to the subscription, and any redistribution of the product rendered the subscription null and void.

In other words, if Linux users wanted access to the applications and hardware certified to Red Hat's platform, they had to run Enterprise Linux. To run Enterprise Linux, they had to acquire it from Red Hat via the new subscription model, which entailed signing a subscription agreement that forbade them from redistributing it. More precisely, customers were still free to redistribute Enterprise Linux, but in doing so, they lost all support from Red Hat and, most importantly, from the legions of ISVs and IHVs that certified to the Red Hat platform. Red Hat's transformation was complete when it dropped its Red Hat Linux product line altogether in 2003. Red Hat's new model was still in keeping with the letter of the open source movement but no longer with its spirit.

Proprietary Linux?

By any measure of the term, this is proprietary lock-in, albeit lock-in attained not through the traditional means of source code intellectual property—i.e., proprietary software. In a way, Red Hat has learned from the lessons of Dell's success: it has come up with a clever new business model to match the commodity market in which it competes. Operating systems used to be sold as products in boxes or bundled with other products; Red Hat realized this approach would not be profitable in the new operating system market Linux was helping to create, so it found a new, profitable way to sell its operating system products.

However, in a very real way, Red Hat's model is also dangerously close to the model employed by the Unix vendors, which had catastrophic consequences. It is attempting to decommoditize the Linux platform, not through proprietary extensions in the form of software, but through a redefinition of the Linux platform to its own ends and restrictions on how that platform can be used and redistributed. True, the source code to its platform is still freely redistributable, but with the proprietary position shifted away from source code intellectual property to third-party relationships and subscription agreements, the rules of the game have changed dramatically here as well.

What's at Stake?

Red Hat's new business model may be helping its revenues in the short term, but is it in Red Hat's best long-term interest—not to mention the best interest of the Linux ecosystem as a whole—if Linux is owned by a single company, or if Linux fragments like Unix did as Red Hat's competitors follow down the proprietary Linux path?

If Red Hat's business model is wrong, what is the right business model for Linux distribution vendors? In my view, the Dell model can be taken a step further than any of the Linux distributors have thought to take it. After all, what are open source technologies but commodity software components, and what are Linux distributions but assemblers of those components into products the end customer finds useful?

Indeed, such an "assembler of commodity software components" business model might fully realize benefits of Linux that the traditional, product-oriented business models of the Linux distribution companies have failed to capture: flexibility and choice, without the substantial expertise and financial investment a company would otherwise need to adapt a Linux distribution to its own purposes. What if a Linux distribution were a collection of parts that could be mixed and matched to suit the needs of the company buying it, instead of a one-size-fits-all, monolithic product like the Linux distributions of today?

As with new business models that have come before it, such an approach would open Linux to new markets—markets that are already using Linux, but for which today's product-oriented business models are ill suited: server appliance vendors, set-top box makers, and others to whom Linux is an invisible vehicle for driving their own products. In their world, Linux is a piece of infrastructure, not a product to be owned by Red Hat or anyone else.

Indeed, this is the model being employed by my company, Progeny. Our approach is to embrace the commoditizing effect Linux and open source software have on the software industry instead of fighting it. Since every company needs a proprietary advantage of some kind, we've chosen to build ours through our processes, not our technology, much as Dell did—in other words, to leverage our expertise in distribution building to help other companies assemble commodity software components from disparate places into cohesive wholes, and to do so in a scalable and flexible way.

Beyond building a better business model around Linux, what's at stake? I contend far more is at stake, for one simple reason: Linux needs to remain a commodity, as it is now a core piece of infrastructural technology at the heart of the computer industry. Indeed, Linux was enabled by the commodity nature of the last infrastructural technology to redefine the IT industry: the Internet.

In "IT Doesn't Matter," which appeared in the May 2003 edition of Harvard Business Review, Nicholas Carr points out that infrastructural technologies "[offer] far more value when shared than when used in isolation." What happens if Linux is decommoditized and ends up being the proprietary product of a single company to serve its own purposes? What if the PC or the Internet had been decommoditized? Where would we be today?

Carr's essay provides hope that there is money to be made in infrastructural technologies that have been fully commoditized, and that there's no need to try to own those infrastructural technologies:

...the picture may not be as bleak as it seems for vendors, at least those with the foresight and skill to adapt to the new environment. The importance of infrastructural technologies to the day-to-day operations of business means that they continue to absorb large amounts of corporate cash long after they have become commodities—indefinitely, in many cases. Virtually all companies today continue to spend heavily on electricity and phone service, for example, and many manufacturers continue to spend a lot on rail transport. Moreover, the standardized nature of infrastructural technologies often leads to the establishment of lucrative monopolies and oligopolies.

Carr's essay also provides historical perspective on the commoditization process:

...infrastructural technologies often lead to broader market changes. [...] A company that sees what's coming can gain a step on myopic rivals. In the mid-1800s, when America started to lay down rail lines in earnest, it was already possible to transport goods over long distances—hundreds of steamships plied the country's rivers. Businessmen probably assumed that rail transport would essentially follow the steamship model, with some incremental enhancements. In fact, the greater speed, capacity, and reach of the railroads fundamentally changed the structure of American industry.

In a commodity world, technologists need to think about innovating in their business models as much as (if not more than) innovating in their technology. Of course, it's a natural trap for the technologist to think about technology alone, but technology is but a small part of the technology business. Look for your competition's Achilles' heel, which more often than not is an outdated business model in a changing world, not technology. To attack your competition with technology alone is to charge the giants head on, and this approach is doomed to failure the vast majority of the time.

Businesses operating in a commodity world also need to build business models with the larger ecosystem in mind. It is tempting, once the incumbents have been overthrown through the powers of commoditization, to lapse into the same old proprietary lock-in strategies that served the former incumbents so well. In effect, though, this is decommoditizing the industry, "poisoning the well." It is possible to build a successful business in a commodity market, as Dell and many others before it have shown, and in the long run, it is far better to ride the forces of commoditization than to fight them.

Notes

  1. To avoid confusion, I will use the term Linux to refer to the operating system, following standard usage. When referring to just the Linux kernel, I will say "the Linux kernel."