Under the Hood: Open Source and Open Standards Business Models in Context


Stephen R. Walli

People debate regularly whether open source software is "good for business," and how one makes money on something given away "for free." People raise concerns over the commoditization effects of open source,[1] and portray a gloomy road ahead where open source software will "eat its way" up a stack of functionality to the logical conclusion where software has become valueless.

Standards as a commoditization driver have been well understood for quite some time across many industries. A standard exists to enable multiple implementations. The economic argument is that they serve to broaden the market for all producers while fostering price competition (which also fosters production efficiency) for the benefit of consumers. Industry associations of vendors support such work where it expands their market opportunities in complementary areas. Governments support such work because of the "good" economic effects. Seldom does one hear complaints about this commoditization effect, and vendors continue to participate in the development of standards and compete on implementations regardless of that effect.

In this chapter, we will look at traditional working definitions of open standards and open source software, focusing on their surface differences. We will then step back and look under the hood at the broader business dynamics at work, providing a business model in which standards and open source software can be seen in context.


Open Standards

A standard can be a specification, a practice, or a reference model. It is used to define an interface between two (or more) entities such that they can interact in some predictable fashion and to ensure certain minimum requirements are met. Standards exist to encourage and enable multiple implementations.

It is important to put some simple perspective on the standards discussions that follow, as books can be written about this seemingly dry subject. We will look at the context for standards defined by their development and use, a process for developing and maintaining standards, and a set of implementation issues such as intellectual property, conformance, and certification. Finally, we'll discuss the history of the concept of "open standards."

Standardization efforts are typically divided into various categories, but the classification systems are often orthogonal. For example:

  • Standards can be categorized by the type of development organization—e.g., national or international body, industry and trade associations, and consortia.
  • Standards can be viewed as industry voluntary efforts or government-regulated efforts.
  • Standards can be thought of as formal de jure specifications developed through consensus, or market-dominant de facto product technologies.

All standards live within a context of development and use. Many formal standards are developed by national bodies or international organizations such as ISO. These standards often define procurement policy for government organizations and large enterprises alike. Industry and trade associations develop standards relevant to their expert and specialized constituencies. In the information technology space, for example, the IEEE has a standards arm, and historically CBEMA (now NCITS) and Ecma International acted as standards development organizations in the U.S. and Europe, respectively, for IT standards. Each of these three organizations was accredited within its national and regional geographies to produce standards that could be later adopted by the relevant nationally or internationally sponsored standards organization to prevent overlapping efforts, and to build on the relevant expertise within different industry groups.

Narrowing the focus even further, consortia of vendors often arise within a specific area of technology within an industry to develop standards and specifications. The consortia often try to build specifications more quickly to expand a particular market, feeling that the more traditional organizations are too slow to deliver standards.

We can categorize standards differently if we bucket them between regulatory versus voluntary standards. Government regulation defines a separate set of concerns over the voluntary work of many organizations within industries. Such government involvement is often driven by economic concerns for the public good (e.g., communications-related standards) or safety issues (e.g., pharmaceutical testing and registration requirements or vehicle safety). Regulatory-based standards will not be discussed further in this chapter because the focus is on the role of standards and open source in market-dynamic areas rather than government-regulated areas.

Another categorization attempts to discuss the difference between de jure standards developed in a consensus-based process and de facto standards. A more accurate statement might be that de facto technology describes a market-dominant product, rather than a specification for interoperability open to all implementers.[2]

Common examples of voluntary information technology standards across this organizational spectrum include SQL, HTML, TCP/IP, and programming language standards like C/C++ and C#.

Standards act as a yardstick against which multiple competing implementations can be judged in the marketplace to make sure that certain basic requirements are met. Vendors compete on implementation beyond the standard to establish competitive differentiation in the market. Ultimately, customers choose the product that does more than simply meet their base requirements. It is this relationship among specification, implementation, and competitive differentiation that provides basic interoperability among vendors, drives competition, and spurs innovation.

All standards organizations have rules about participation, construction, adoption, and amendment. They establish processes for how meetings are carried out to promote fairness of discourse and prevent anticompetitive practices. Standards development organizations also put in place intellectual property rules to ensure participants are aware of the intellectual property landscape with respect to the standard under development.

Most standards bodies require participating holders of essential patents to announce the existence of such patents, and to make them available on "reasonable and non-discriminatory" (RAND)[3] terms if an implementation of the standard would require a license to the patented technology. Under RAND terms, patent holders cannot discriminate against a particular company or a particular platform. Standards organizations supporting such patent policies ensure that developers interested in delivering standards-based products can do so, while ensuring developers that have invested in a particular invention still have their investment respected.

It is important to remember, however, that no standards development organization can speak for the intellectual property of developers that are not participants in that organization. Standards development organizations structure their patent policies this way because they cannot be the policing organizations nor bear the liability for patent infringement cases from nonparticipants. They are neither funded nor set up to do so. Indeed, if they took on this role, they would likely collapse under the fiscal burden and serve no one.

The interesting thing to observe is that while standards exist to encourage multiple implementations, patents are government-enacted legal tools to protect a single implementation. A patent gives the inventing company government-enforced, time-limited legal protection of an invention by preventing others from building it. This protection allows the inventing company to recover the costs of bringing the invention to market in return for publishing the idea for future use by the broader market. A patent is in some regards the antithesis of a standard. Standards are to trade agreements as patents are to tariffs. By definition, they serve different purposes in the economic landscape.

Just as standards organizations are not organized or funded to handle intellectual property liability claims, neither are they typically the conformance certifying agencies for implementations of the standards they produce. Conformance requirements in the standards and specifications are typically simple "claim" style—i.e., you provide the functionality required by the standard and claim conformance to the standard. Organizations that care about conformance then take on the fiscal and legal responsibility of verification around the conformance claims. For example, in the government space in the U.S., the National Institute of Standards and Technology (NIST) developed a procurement process (FIPS[4]) and certification testing process for the standards that it cared to use in those procurements. The government was acting appropriately to protect and serve the public good in federal procurement policy—essentially putting public tax dollars where its mouth was to improve the return on investment. In a commercial setting, The Open Group (née X/Open) as a market consortium handled conformance claims and liability for its specifications. Beyond the testing requirement, warranties of conformance were required and a brand license was signed, tied to the trademark usage associated with the standards the consortium produced. Companies that wanted to use the trademark on their products in the market had to pay royalties. The X/Open standards were developed through the organization, and a company paid for its seat at the specification-setting table through its consortium membership dues. Conformance certification, on the other hand, was funded through the cost of trademark use.

If standards act to define a base functionality to encourage multiple implementations, essentially the greatest common denominator for a specific technology, they help create a commodity. This results in a constant, healthy tension among the standards bodies' participants as they work with each other on the standard, while simultaneously vying for market share with their different products.

The term "open" with respect to standards became a mantra in the late 1980s and early 1990s, and was tied to the concept of "open systems." As Cargill observed, "open systems" was marketing-speak for the idea that if all the vendors would just build their computing products to "open" standards, the consumer would be able to build data processing systems by mixing and matching information processing hardware and software modules in much the same way that one could mix and match stereo components to build the desired system.[5] "Open systems" was a description of the architecture the consumer thought should exist. Unfortunately, the complexity of interconnected data processing systems doesn't lend itself so readily to the metaphor of a single-purpose device (i.e., the stereo system) and the ability for plug compatibility between stereo components to solve all the attendant complexity.

"Openness" became a quality attributed to the standards that would enable open systems. The openness was an attribute of the creation process (the standard was built in some form of public, consensus-based process open to all participants) rather than an attribute of implementations of the standard.

The development model for a standard is unrelated to the development model used for the implementation of that standard. It is equally possible for a standard (open or otherwise) to be implemented in a closed proprietary software product or in an open source software project.

Open Source Software

Open source software (OSS) is a term applied to a collection of software development, licensing, and distribution practices. A lot has been written about OSS over the past decade, as various open source projects gain market importance and the license models demonstrate economic significance. Eric Raymond's original treatise[6] on the development practices remains relevant. The Open Source Initiative publishes the definition of the requirements a license must meet to be considered an "open source software" license. I will focus on a group of attributes of OSS projects that sets up the economic discussion to come.

OSS projects are interesting "buckets" of technology. Successful OSS projects share a number of attributes:

  • Distributed communities with good software development practices develop technology packages that satisfy well-defined needs.
  • Software quality is a measure of community activity (i.e., the developer customers).
  • Contributions reflect the individual economic considerations of the contributor and are based on selfish asymmetric value propositions.
  • The projects reflect their Unix history of loosely coupled component architectures with well-defined interfaces that make it easy to assemble larger solutions (e.g., the LAMP stack is assembled from Linux, Apache, MySQL, and Perl/Python/PHP).

OSS projects develop software packages in a distributed community where the core developers that inspired the project act as a hub for the evolution of the software as a "benevolent dictatorship." Just like all successful software projects, successful OSS projects support a strong software engineering discipline and ethic at the project's core. Essentially, good software is developed by good software developers.

What makes the software "open source" is the licensing model. While a wide variety of licenses are considered "open source licenses," the basic common denominator (without relisting all the requirements from the Open Source Initiative) is that the software's source code is always freely available and users can modify it without restriction; however, requirements associated with distributing the software may exist. In similar fashion to standards efforts supporting a lack of discrimination (either in participation within the context of their community or in their intellectual property engagement goals), OSS licensing discriminates against no one. Anyone can participate in the community development of the software. Anyone is free to use the software. Anyone can see the source code. Anyone can distribute the software. In each case, requirements may be imposed by the license or reputation that must be earned in the community, which would lead some to not want to participate, but nothing inherent in the process prevents participation, use, or distribution.

An interesting dividing line in the licensing schemes is whether the license is considered "viral." A reciprocal license such as the GNU General Public License (GPL) attaches itself to new software by requiring that if the software is modified and distributed, the license is attached to the new software. This forces the "open" aspect upon new software, keeping the source code publicly available. A company may be wary of publishing the source code to its software, as it may contain trade secrets or other third-party licensed software for which it doesn't have the ability to publish the source. The classic permissive licenses arose in academic settings (e.g., the Berkeley license and the MIT Project Athena license) and had no requirement to associate new work with the license. This class of licenses was very liberal in what was allowed, and a company could easily take software, modify it, and not publish the new source code.

One of the most interesting aspects of OSS development is the economics of the community participation. Surveys have been run and much has been written about the rationale for participation.[7] The "simple" economics is that participants in a community get more than they give. It is a normal selfish asymmetric value proposition. To understand that statement, think about context for a moment. Many people in many walks of life use and value their skill sets differently in different contexts. A writer might be a technical writer or communications writer for a corporation as her paying job, but still use that same collection of writing skills teaching an English as a Second Language class in the evenings, working on a writing project with her child's class at school, and writing a sonnet to a loved one. In each case, she values her skill set differently, and the reward varies accordingly. Software developers are no different. The interesting aspect of community is that corporations are equally economically rational in their participation. Developers and corporations participate in OSS projects because of the same simple asymmetric value proposition. Many companies participate in OSS projects and draw upon the software to deliver the products and services upon which they base their revenue streams. We will look at this a little more closely in a moment.

Coupling a license and distribution model that ensures the source code is freely available with a disciplined core project team allows the community effect of OSS development to shine. The community of interest in a particular project can directly contribute changes and bug fixes. While the number of bug reports submitted may exceed by orders of magnitude the number submitted with proposed fixes, which in turn exceeds the number of "good" fixes that meet the bar defined by the core project team, there is definitely a net gain for the project, from both a testing and a bug-fixing point of view, as well as the opportunity to find new talent that wants to participate in the project.

The Real Business Model

Customers view solutions as a network of related "bits" that have to come together in some definable fashion to solve their IT problems. This network can be defined with nodes representing various technology objects and the paths between nodes representing the relationships. This is a very informal network, but very real. For example, a solution for a new retail inventory management system will include nodes representing the existing application systems to which the new retail system must interface, computer resources on which it will run, the programming language environment in which it will be developed and maintained, the staff and their experience and skill sets that will develop and then maintain the new system, databases with which it will need to interact, and so on. The other application systems to which the new retail inventory system will need to interface have their own historical networks. The platform resources may represent a different network view if multiple application systems share the fundamental computing platform. Companies define architectures for their IT functions to attempt to simplify the decisions that need to be made, and often publish these as internal procurement and development standards. History also counts in the network—for example, some shops always buy "Unix" hardware or always program in C or Java, because that is how their resource history has developed.
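The informal network described above can be sketched as a simple undirected graph. The following is a minimal illustration only; the node names and relationships are hypothetical examples invented for the sketch, not drawn from any real customer architecture:

```python
# A minimal sketch of a customer "solution network" as an adjacency map.
# All node names and relationships below are hypothetical illustrations.

from collections import defaultdict

solution_network = defaultdict(set)

def relate(a, b):
    """Record an undirected relationship between two solution nodes."""
    solution_network[a].add(b)
    solution_network[b].add(a)

# The new retail inventory system and its surrounding nodes.
relate("retail-inventory", "order-entry")    # existing application interface
relate("retail-inventory", "unix-servers")   # platform it runs on
relate("retail-inventory", "java")           # language/skills history
relate("retail-inventory", "oracle-db")      # database dependency
relate("oracle-db", "unix-servers")          # shared platform history

# A vendor tries to "cover" as much of this network as feasible with its
# core product plus complements; the neighbors of a node show what any
# single piece of the solution pulls in with it.
print(sorted(solution_network["retail-inventory"]))
# -> ['java', 'oracle-db', 'order-entry', 'unix-servers']
```

The point of the sketch is only that each node carries its own history and constraints, so a vendor's sales problem is one of mapping its own product network onto as many of these customer nodes as possible.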

Turning the discussion around to the vendor-centric product perspective, Geoff Moore defined a model[8] in 1991 for technology adoption that suggests that once a market starts to develop, a company best leads by providing a customer the best "whole product solution." By this he means that the vendor offers its core value product proposition to the customer and then needs to wrap as much around that product as it can to present a "complete" product solution to the customer to meet the customer's broader needs, essentially mapping as much of the customer solution network as possible. Another way to think about this is that the vendor wants to provide as many complements as it can to its core product offering, covering as much of the customer's solution network as is feasible to present the best (most valuable) solution in the customer's eyes.

The business of a vendor would then be to ensure that the complements were as inexpensive as possible, indeed commoditized if possible, so that the whole solution, from the customer's perspective, is as inexpensive as possible—but the lion's share of the revenue would come to the vendor through its core offering. Several business tactics and tools are available to the vendor to try to drive these complement spaces:

  • Traditional buy-versus-build strategies can be used to ensure that as much of the customer's solution as possible is provided through the vendor's own brand, regardless of whether the complement products are offered as add-ons or are bundled directly with the core revenue stream.
  • Develop a rich ecosystem of add-ons by encouraging developer and partner networks to provide a richer whole solution to the customer. Publishing proprietary specifications for the complement space enables more partners to develop businesses in the complement spaces.
  • Develop tool spaces that help add complements to the complement ecosystem.
  • Provide certification programs around the core technology to ensure that there are lots of service professionals to help the customers complete and support their solution. Indeed, a company might have its own consulting services arm for parts of a solution, and provide certifications for other parts of a solution.

Taking this view, a company's assets and offerings also form a network of related products and services it matches against the customer's solutions network through the sales and marketing functions. Each node in the network has cost, risk, and revenue models associated with it, and as long as the overall revenue model is greater than the sum of the costs, the company will be profitable.

It is important to remember, however, that no company exists alone in the market to solve the customer's problems. Each vendor in a particular space must have different product networks to allow a differentiation in its sales pitch to the customer. Different vendor companies will also behave differently in their hiring and acquiring strategies to shore up their "whole product offerings."

In addition, it is important to note that one can now look at intellectual property (IP) tools (and by that I mean trademarks, patents, copyrights, and trade secrets) in context. Each of these four legal property types or tools (regardless of legal and geographical jurisdiction) provides a different set of legal protections at different costs. One is far more likely to spend heavily and strategically with IP protection tools in the spaces defining one's core product value proposition or in spaces in which one has the greatest investment, than farther out in the complement spaces of one's product offering network. Indeed, in the complement spaces, a vendor may aggressively publish (or sparingly strategically patent) to ensure that no other vendor can patent in the complement space and raise the prices on that complement.

If we now start to consider open source and open standards in this core-complement context, we see that they are simply additional tools in the tool chest to drive complement spaces. Let's look at each separately for a moment.

Open Source Complements

It becomes very easy for a vendor (OEM, ISV, or systems integrator) to bootstrap a complement product or project space for its core value proposition to its customers using open source software directly. The projects are polished to product readiness either within the company or within the community itself. To "buy" versus "build" as complement strategies for a vendor, we can now add "borrow" and "share." If a vendor joins an existing community, it can polish the OSS project to product readiness to complement its core value proposition to its customer. If it starts its own project, it can be used as the hook to find and engage with new customers around the rest of its core offering.

The engagement in the community is actually a very leveraged conversation directly with people interested in the community's project and then possibly the company's offerings. As people cross the line from community participant and software user to potential customer, they are self-selecting the vendor's services. This is a very efficient way to find new customers. This does not mean one should consider the community as a mass-marketing broadcast channel (it's not), but rather, as a public conversation with one's customers and potential customers. This is not for the faint-hearted. Unlike a traditional "Go to Market" plan, the technical people have real-time unmanaged discussions with the customers.[9]

The vendor's challenge becomes ensuring that products remain products and communities are communities. Starting a community project is not that risky if the vendor plays by the rules, staffing it with good software developers that will lead the community well, and understanding that the real return is the conversation they have with customers, and the product complement effect. The "community" at large does not exist to work for free improving a company's products. This mistake is still being made despite the public experiences of the past.

The community leadership is a benevolent dictatorship. Sponsoring the community (or earning your place in an existing community) does give the vendor the opportunity to manage things on its own terms. Software stability is maintained through the community project by the leadership. Project direction is developed by the community leadership and people that have joined the community and earned their position of trust. There may not be a road map with a view three to five years out, as is almost necessary in a product, but the complement space doesn't need the rigor of the core product. Viewpoint becomes important. A customer's view of the need for a road map around a solution may not map to a vendor's view of the need for a product road map.

While a number of relatively small companies are using OSS in their businesses, large vendor participation is very interesting. [Caveat lector: the following examples are observations from the author and do not represent any direct knowledge of these vendors' business plans or models.]

IBM has made three big plays: Apache, Linux, and Eclipse. IBM joined the Apache community six years ago, borrowing a web server while selling WebSphere. It joined the Linux community four years ago while managing the commodity curve on the AIX product line and using it as a competitive shot into the Sun server market. Most recently, it has begun a "share" project, creating the Eclipse project out of technology it acquired (and then it acquired Rational).

In joining the Apache community, IBM doesn't need to maintain its own web server team and can focus its efforts on WebSphere instead. In the Linux community, it can focus on the parts of the OS that best meet its needs. Linux is clearly becoming the Unix server replacement over time. IBM's AIX product space will be replaced. It can either actively participate and position itself on the leading edge of the curve, or wait until its product space is consumed.

SAP released a complete modern relational database for free in August 2002 to drive its core business into the mid-tier customer space where the customer may not already have an enterprise-class database and may not be willing to pay the "Oracle/IBM/Microsoft" tax to get SAP R/3. It was released under the GNU GPL after a two-year, 100-person investment in updating the acquired Adabas technology. SAP then partnered with MySQL AB in Sweden to "manage" the database community.

Sun Microsystems worked in the GNOME desktop community to develop, acquire, and contribute the accessibility features it needed to meet U.S. government procurement policies to complement its Linux workstation offerings. For a relatively modest investment in the tedious and difficult accessibility technology, it is getting an entire full-featured desktop environment.

In each case, the corporation is getting more than it gives, developing a complement rapidly around core offering(s). They gain time-to-market for the complement at a reduced investment. While initially met with skepticism when a large company joins an existing community, as long as that company plays by the community's rules with respect to engagement and quality, it can become as accepted as any other active participant. Depending upon the nature of the product relationship to the core and company commitment, the company may make best efforts to hire key community developers. This is not altruistic, but neither does the company expect the developers to change their community engagement. It gives the company deeper insight into the community it is looking toward for support as it develops the complement.

There is a competitive edge to OSS community development as well. Often the company takes advantage of the reciprocal aspect of the licensing to salt the intellectual property fields around it by aggressively publishing prior art, holding the complement costs down, and preventing competitors from directly monetizing their original investment in the community project software. For example, SAP is not in the database business and so may feel comfortable publishing the investment in SAPDB (now MaxDB), but it probably doesn't want Oracle, Microsoft, or IBM directly making use of that investment in their respective database products. In this case, the reciprocal license is the most business-conservative license SAP could choose. As well as driving a complement directly, the community engagement also allows the vendor to work closely with partners, customers, and potential customers to build the relationships it will need to sustain the business over time.

The other competitive aspect happens when you consider two competing vendors' product-centric networks, and how they appear to the customer. The customer is looking at things as a "whole product solution" and does not really think (or care) about what is core or complement from the vendors' perspectives. A vendor can develop a complement community directly in the path of a competitor's core value proposition to a mutual customer. It need not be a deliberate move and the sole purpose of a community; it is the icing on the cake of the multifaceted approach of a business in using OSS development and engaging with its customers.

Small companies can also easily use the OSS buckets to bootstrap product complements. Clayton Christensen's original research[10] around disruptive business models shows how small companies assemble off-the-shelf parts into underperforming products compared to the industry norm, offering those products in their own niches with different business models. As the sustained innovation around the new disruptive product develops, it eventually becomes mature against the yardstick used to judge the incumbent but at a better price for the performance, and the incumbent's business is disrupted. Consider the development of the Linux operating system—from its inception in 1991, delivered by a university student, its growth in educational use, to simple infrastructure servers, to the point in history where it is presently challenging the traditional Unix vendors' products (though it has, in some cases, become too complex to teach anymore[11]).

There is also a situation, as we shall shortly see, where a product market hits the point when customers start to be overserved, and there is a call for standardization. This means that OSS components that already represent a package with well-defined interfaces may be a rapid way to bootstrap a "good-enough" product into that market.

One thing to note in this discussion using the network of core and complements together is that there is no "stack" of technology per se. Think back to the earlier discussion of customer-centric solution networks and vendor-centric product networks. Vendors may see their world as a stack with their valuable core at the top and all the commoditized complements below, but in reality, it is simply their view through their own product stack and its relationship to the customer and their partners. A chip manufacturer views the stack very differently from an operating system company or from a middleware company (hardware design in silicon is where the value is, with operating systems and middleware and apps being less and less interesting to the chip manufacturer). The terminology of eating up the stack may have more to do with the position in which the vendor perceives itself than with any single stack the customer actually sees.

Open Standards Complements

Clayton Christensen further observed in his research[12] that as companies begin to deliver functionality in their product lines faster than customers are able to use it, and therefore faster than customers are willing to pay for it, the market begins to call for standardization. Indeed, prior to the point where it begins to overdeliver, the market leader is typically offering the technology in a tightly integrated fashion, which best serves customer needs while solutions are not yet good enough. That is the time when tight integration, not standards-based components, is the path to success. Standards develop once the marketplace reaches the point where the market leader begins to overdeliver. These are the circumstances in which a market-dominant de facto technology reaches a critical point and a call for de jure standardization becomes possible.

The signal to standardize a technology is somewhat unclear, but there is likely a collection of factors:

  • Competitors with standards experience and similar product offering networks, but different core drivers, will likely use the opportunity to "call for standards,"[13] hoping to reduce their own complement costs while causing a competitor grief in a core revenue stream.
  • Customers managing substantial procurement budgets will support and call for standards in the hope of prolonging their investments and reducing costs from vendors that are overdelivering. For example, the U.S. government, the largest IT buyer on the planet at the time, led the charge around the POSIX and C-language standards, quickly followed by large companies in the petroleum and automotive industries.

If you are the one true implementer, and the market (i.e., partners, customers, and competitors) is calling for standardization in your core technology space, you have a problem. They're calling for the benefits of standards (an expanding market and price competition) because they want the ability to replace you. Some segment of your customers wants the choice of multiple implementations. Your competitors are happy to support the call, as this is the thin edge of the wedge to break open your value proposition to your customer, all in the name of open systems. Your partners may be happy to support the call for standardization because they want price pressure as their margins diminish, and perhaps because your percentage of their cost of goods sold is increasing.

It is important to note that one needs to get the view of the market "right" for this sort of discussion, and hindsight is always 20/20. It is not necessarily the dominant vendor's product that is to be standardized, but the product market space. For example, one can argue that the POSIX standards (and the C-language standards, for that matter) were not about standardizing Unix systems, but rather were an effort to standardize an OS interface for minicomputers. Digital Equipment Corp. was the dominant player in minicomputers (which became departmental servers and workstations). DEC was driving customers up the hardware upgrade cycle to support its market growth faster than customers were willing or able to absorb the change. Unix systems of the early and mid-1980s represented the best opportunity around which the market could form a minicomputer application programming standard to support customers' application portability. While the Unix systems of the day were often less scalable, less robust, and less secure than VAX/VMS systems, the Unix operating system had been ported to most vendors' hardware (including DEC VAXen), so competing vendors could see the market opportunity.

At the same time, the PC arrived on the scene. Many have argued that the PC won against Unix systems by taking over the desktop, largely due to the inability of the Unix vendors to set a desktop "standard" fast enough. The PC certainly took the desktop by storm, but it was actually competing against nonconsumption. In a Christensen view of the world, it was put together from inexpensive parts and certainly underperformed compared to minicomputers, but it became the de facto business appliance in a document-centric world, enabling a whole new class of electronic document-centric applications. (Word processing systems companies vanished almost as fast as the minicomputer companies.) By competing with nonconsumption, the PC gave business users computing resources on their desktops instead of leaving them stuck waiting for their business data processing applications to be developed by corporate IT, with its ever-growing systems development backlog. The Unix systems (driven by standards and an "open systems" message) were data processing-centric rather than document-centric, and caused DEC grief in a completely different space.

Christensen observed that as an area of technology is standardized, the value moves to adjacent spaces in the network.[14] The trick then becomes to ensure that one is building one's business efforts in the product network around the space being standardized. This would lead us to believe that the richer a product offering network a vendor has, among different software, hardware, and service components and products, the more opportunity that vendor has to move with the value or to define new components that the old components complement.

This core-complement product network view allows one to very rapidly see how the vendor politics in a standards working group play out. A vendor with a de facto product technology that is being dragged by the marketplace into a de jure standards working group is likely less than enthusiastic about participating in its own commoditization. The vendor alliances within the working groups form among participants in the complement space. The game is one of technology diplomacy, where the goal as a vendor representative is to expand your area of economic influence while defending sovereign territory. This holds true regardless of whether one is participating in a vendor-centric organization such as Ecma International, as an "expert" on a national delegation to the ISO (on behalf of one's employer), or as an individual contributor to an organization like the IEEE (again, funded by one's employer to participate). Vendor consortia offer a similar view. Which vendors formed a consortium, and which vendors quickly and noisily joined shortly afterward, says a lot about who the incumbent in a product space is and who the competitors are.

Conclusions

Businesses are often much more than simply hardware companies, software companies, or service providers; they offer a breadth of products and services in their overall value proposition to their customers. Successful companies use a collection of strategies to deliver a "whole product offering" for their customers, driving their core revenue generator with a host of complementing products and services.

Standards have traditionally been one tactic or tool for driving additional complement value to a customer: they develop a complement space in a maturing market, with many implementations at reduced prices.

Open source software can also be used as a tool to develop a complement space that supports a core revenue product or service. The open source project can act as a quick and convenient bucket of technology around which other product offerings are wrapped, or one that plugs into an existing product offering network.

A number of models were presented for thinking about customer-centric solution networks and vendor-centric product networks (and about the product network from a core-complement point of view), alongside Moore's traditional Technology Adoption Life Cycle and Christensen's models for how product markets behave. Open source software projects and standardization efforts can both be viewed as tools for attaining competitive advantage. A number of large corporations now participate in OSS communities to the benefit of both the corporation and the communities, just as corporations have historically driven voluntary standards engagements. The model-based view certainly doesn't take away from the excitement inherent in different OSS projects or the overall economic value of a successful standard. It merely provides context to businesses that want to understand how to adopt and participate in either.

Notes

  1. "Will Open Source Middleware Commoditize J2EE?", Nov. 5, 2004; visited Nov. 9, 2004. Also: Paula Rooney, "Open Source Will Commoditize Storage, Databases and Security," Jan. 20, 2004; visited Nov. 9, 2004.
  2. As we shall see, Clayton Christensen has proposed a situation that says there is market pressure that can change de facto technologies into de jure standards.
  3. While each organization's rules are stated somewhat differently and with different levels of formality, a quick look at the governing rules of the IEEE, ISO, IETF, and Ecma International shows a remarkable similarity.
  4. FIPS stands for Federal Information Processing Standards.
  5. Carl Cargill, Open Systems Standardization: A Business Approach (Upper Saddle River, NJ: Prentice Hall, 1997), 70-71.
  6. Eric Raymond, The Cathedral & the Bazaar (Sebastopol, CA: O'Reilly Media, 2001). The original essay was published in 1997.
  7. Most notable were the surveys by the Boston Consulting Group (visited Dec. 15, 2004) and the broader FLOSS survey done at the University of Maastricht (visited Dec. 15, 2004).
  8. Geoffrey Moore, Crossing the Chasm (New York: Harper Collins, 1999).
  9. The first thesis in the Cluetrain Manifesto is "Markets are conversations." Indeed, most of the 95 theses are highly relevant to the discussion (visited Dec. 15, 2004).
  10. Clayton Christensen, The Innovator's Dilemma (New York: Harper Collins, 1997).
  11. In interviews in February 2003, a number of university OS professors observed that the then-current revision of Linux, with the addition of symmetric multiprocessor support, had become too complex to teach. As a result, they were basing their course work on earlier versions of Linux.
  12. Clayton Christensen, The Innovator's Solution (Boston: Harvard Business School Press, 2003).
  13. Geoff Moore argues that the first response in the market from competitors when they see a "gorilla" forming is to cry for "open systems" (Geoffrey Moore, Living on the Fault Line [New York: HarperBusiness, 2002], 119). This may drive standardization too early, with all the attendant problems that ensue, as has been observed by James Gosling of Sun Microsystems (James Gosling, "Phase Relationships in the Standardization Process", circa 1990). Gosling's observations are more closely in line with Christensen's, arguing that there is an optimal time in a technology's development for standardization. Some of us have always suspected that it is best to standardize existing practice and experience, instead of trying to standardize ahead of the market curve. Indeed, it would be interesting to survey successful and unsuccessful standardization efforts to determine whether the unsuccessful efforts were undertaken too early in a marketplace, when vendors were still trying to define the marketplace itself and stake out claims with products and patents. First, of course, one would need to define the measure of a successful standard. Christensen's observations are likely more in line with standards forming at the optimal market time.
  14. This was originally referred to as "the Law of Conservation of Attractive Profits," but is now referred to as "the Law of Conservation of Modularity."