Beyond Java/The Perfect Storm

The power and the fury of the storm caught us off guard. El Niño, a weather pattern famous for producing a continuous stream of storms in Texas, seemed to misfire over and over. The core of the Austin kayaking community, dependent on storms to fuel our unfortunate addiction, sat frustrated around an ancient TV with a snowy signal, watching storm after storm split up and float completely around us. Around 11:00, everything changed. Like every day leading up to this day, a line of storms lay spread out before us like kids at a Harry Potter movie on opening day. Only this time, they punched Austin, hard.

El Niño, the split jet stream, filtered across the ocean and brought warm, moist air right across Texas. It collided with the cooler air of a cold front. The pressure system in the South fed a rotation, and locked the cold front in place. The warm air exploded into the cold and produced a perfect storm. We opened the topographic maps and found a stream that had never been run. It had the steepness and geographical features that we were looking for. It simply had not had enough water. As we planned the trip, the mighty storm hurled a string of consecutive lightning bolts right near a hilltop, less than a mile away. Distracted, we stared into the night, alternately black and blinding.

Storm Warnings

To know where Java is going, you've got to know where it came from. You need to remember the conditions that caused us to leave the existing dominant languages in droves. You must understand the economic forces that drove the revolution. And you cannot forget the sentiment of the time that pried so many of us away from C++ and the other languages we were using to program for the Internet.

In 1995, Java was working its way through the labs of Sun Microsystems, unborn. Sun garnered attention as a champion of standards, and for bringing Unix out of the academic ghetto, but it was not a major player in development environments or programming languages. Frustrations, driven by economics but stemming from inadequacies in programming languages and programming models, rippled through the community in another kind of gathering storm.

Economics of Client-Server Computing

Frustration with long development cycles and inadequate user interfaces drove many companies to move off of mainframe computers. At first, the movement amounted to nothing more than a trickle. As the cost-cutting financial offices measured the software and hardware costs of IBM versus Microsoft on Intel, the trickle became a flood.

But the wave of migrating customers did not consider all the costs. The rapid movements from mainframes to Intel servers drove the first tsunami of chaos because the client-server movement hid significant costs:

  • Management costs skyrocketed. It was too difficult to deploy tiny changes to hundreds of fat clients. Technologists could not figure out how to maintain the many desktop applications and frameworks necessary to make the architecture go.
  • Many customers became increasingly wary of a gathering Microsoft monopoly.
  • The tools of the day made it easy to get started, but did not handle complexity well. Typical customers simply could not make them scale.

Decision makers were caught between the pragmatic approach of a centrally managed solution and the adaptability and lower costs of Intel-based servers. They waited for a better solution, and the clouds darkened.

Microsoft

While developers struggled with C++, Microsoft planned to hammer the final nails in the coffin of OS/2, a competing operating system that it had helped create, then abandoned to IBM. So Microsoft grew in stature and influence, and it learned to cater to developers very well. Companies like IBM dominated the infrastructure groups (called IT, for information technology). Microsoft didn't care. It went straight to the lines of business that used IT applications. Offering quick turnaround time with Excel macros and Visual Basic applications, it stole a large part of development mindshare across the world. Screw IT. The line of business could build the applications itself, and involve IT only after the fact, to clean up the resulting mess.

Microsoft grew, and some of the same people who lauded the end of OS/2 began to grow wary. Microsoft's dominance was a double-edged sword. You didn't have the problem of navigating through a bewildering sea of products and solutions. You didn't have the oppressive integration problems of making multiple vendors work together. You just pitched all the competition and looked to Redmond for the answers. But you had to be willing to give up other choices, and you had to live with the answers that you got. An evolving API stack moved quickly from OLE to COM to COM+. Operating system APIs changed from Win16 to Win32. New flavors and options emerged with new operating systems.

Microsoft captured a core of diligent developers more or less completely. Others bought some of the message, but cast a wary eye northwest. A growing core of developers looked openly for alternatives, like Novell's NetWare or various Unix-based platforms. Individual products, like Netscape Navigator, emerged to compete with Microsoft. The gathering storm seemed imminent.

The Internet

Thunder began to rumble in the distance, in the form of a rapidly growing Internet. In 1995, most people used the Internet to share static documents. Most dynamic sites were powered by command-line scripts through an interface called the Common Gateway Interface (CGI), in languages like Perl. That approach didn't seem to scale very well. While Perl was a very efficient language, applications were hard to read and difficult to maintain. And CGI started a new process for each request, which proved prohibitively expensive. For enterprise computing, the Internet had the reputation of a limited toy, outside of scientific and academic communities.

In the mainstream, Microsoft seemed to miss the significance of the Internet, but many of the brightest minds in other places looked for ways to combine forces, to defang the dominant menace in the northwest. Market leaders always strive to protect their base through proprietary products and frameworks. Everyone else loves standards. IBM, which once built an empire on proprietary models encompassing hardware, software, and services, suddenly did an about-face, embracing every standard that it could find. It Internet-enabled its main products, putting its DB2 database on the Web through a product called net.data, and its mainframe-based transaction engine behind web-enabled emulators. Other companies also built better servers, and more efficient ways to share dynamic content. Netscape rose to prominence with a popular web browser. It looked for a way to ship applications along with documents, and found the answer in a fledgling language, recently renamed from Oak to Java. It started to rain.

Object Orientation

Object-oriented systems support three ideas that you now take for granted: encapsulation, inheritance, and polymorphism. For many years, the industry had been working toward object-oriented programming (OOP). It tried several times, but things never quite came together. The first major attempt was Smalltalk. It was a highly productive environment, but when less-experienced developers tried to push it beyond its natural borders, they had problems. The early hype around OOP was also counterproductive. It positioned OO languages as tools to achieve reuse, and suggested that inexperienced OOP teams could be many times more productive than their procedural counterparts.

Object-oriented software has the potential to be much less complex than procedural programming, but it takes some time to build the expertise to recognize patterns and to layer OO software in a way that makes sense. It also took the industry time to deliver educated developers. Though it now looks like OOP exploded overnight, that's not the case at all. After some early failures with languages like Smalltalk, systems programmers went back to the drawing board to deliver a less-ambitious version of an OOP language, and worked on delivering OOP concepts in a more limited way, as you see in Figure 2-1:

  1. Smalltalk, invented in 1971, was successful as a research project, but did not experience the same success commercially.
  2. In the late 1970s and into the 1980s, APIs for things like presentation systems began to organize the interfaces into logical actions, called events, around objects, like windows and controls.
  3. In 1980, the United States Department of Defense commissioned the Ada programming language, which offered some of the features of OOP, like encapsulation and a limited form of inheritance.
  4. Companies like IBM and Microsoft delivered toolkits to let their users express object-oriented ideas in procedural languages. The most notable were IBM's System Object Model and Microsoft's Component Object Model.
  5. C++ let C developers use C procedurally, and also develop object-oriented applications, side by side.
  6. Java was invented, combining many of the inventions along the way.

Figure 2-1. This timeline shows the slow commercial acceptance of object-oriented programming

Unfortunately, C++ came with its own sorts of problems.

The C++ Experience

As programmers wrestled with OOP, they also dealt with issues related to their chosen languages. Visual Basic developers began to understand that the language and environment might be simple, but they were prone to poor performance and poor design, leaving customers stranded with slow applications that they could not extend or maintain.

In C++, server-side developers found performance, but discovered another challenge. They did application development using a systems programming language. New terminology like memory-stompers and DLL Hell gave testament to the frustration of the masses. Simple problems dogged them.

Pointer Arithmetic

With C++, a pointer could point to any block of memory, regardless of the intention of the programmer. For example, consider the simple program in Example 2-1. It moves a block of memory from one location to another, and inverts it. Unfortunately, the example is off by 1. The code writes one byte beyond the end of the destination block. You would probably not see the error right away. You'd see it later, when you tried to manage the memory of this block, or another one. C and C++ runtime libraries often manage heap memory with a linked list, and the pointers to the next block in the list sit just outside the allocated blocks! These types of errors hurt systems developers, and absolutely murdered applications developers, who didn't have the background to effectively troubleshoot these types of problems. Reliability also suffered.

Example 2-1. Move and invert a block of memory

// Move and invert from_block into to_block; each block holds size bytes.

int i;
for(i=0; i<size; i++) {
  // Intended: to_block[size-1-i] = from_block[i];
  // As written, the first iteration (i == 0) writes to to_block[size],
  // one byte past the end of the destination block.
  to_block[size-i] = from_block[i];  // off by one!
}

Nested Includes

One of my most vivid and frustrating memories from working with IBM came from porting a C++ application that had include files nested 37 layers deep. Nesting like that can be a very difficult problem to manage, especially for inexperienced developers.

The problem goes something like this. In C++, you specify interfaces to your methods, with other supporting information, in a header file, or .h file. For example, in MySQL, you have a main include file that has these includes (I've omitted most of the code for brevity):

    #ifndef _global_h               /* If not standard header */
    #include <sys/types.h>
    ...
    #include <custom_conf.h>
    ...
    #ifdef __LCC__
    #include <winsock.h>            /* For windows */
    #endif
    ...
    #include "mysql_com.h"
    #include "mysql_version.h"

That doesn't look so bad, until you consider that some of these includes are compiled conditionally, so you really must know which preprocessor symbols are defined before you can decide definitively whether something gets included. Also, one of your include files might include another include file, like this line in mysql_version.h:

    #include <custom_conf.h>

In truth, this MySQL tree goes only three levels deep. It's an excellent example of how to code enterprise software in C++. It's not usually this easy. Any dependency will have an include file, and if that code also has dependencies, you'll have to make sure those include files and their associated libraries get installed and put in the right place. Lather, rinse, repeat.

Java does not have this problem at all. You deal with only one type of source file, with one kind of import, and no conditional compilation.
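
As a minimal sketch of the difference (the package and class names here are made up, purely for illustration):

    // One kind of source file, one kind of import, no preprocessor.
    package com.example.reporting;      // hypothetical package

    import java.util.List;              // import a single class
    import java.sql.*;                  // or everything in a package

    public class Report {
        private List rows;              // the compiler resolves List from the import
    }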

Strings

Many of the largest corporations used C++ for enterprise application development, even though it had very limited support for managing strings. C programs simply used arrays of characters for strings, like this:

    char str[] = "Hello";

This allocates a fixed-length array of characters to str. It's merely an array of six characters, counting the terminating null, so it can never hold a string longer than five characters. You could decide to use a string library instead.

C++ did support the C-style string library for some string-like features. For example, to assign one string to another when the memory has already been allocated, you can't simply use =; you copy the bytes instead, like this:

    strcpy (string1, string2);

C-style strings were ugly, dangerous, and tedious. As with any other type of pointer manipulation, you could walk off the end of a block and create an error that might not be discovered for hours or months. They were far more tedious than the string handling in other languages, including Java.

Beginning in 1997, the ANSI standard for C++ introduced a more formal string class. You could have a more natural representation that looked like this:

    std::string str = "Hello, I'm feeling a little better.";

And many C++ libraries had proprietary string libraries. But the damage was done. Many programmers already knew C, and never used the C++-style strings.
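
For contrast, here is a rough sketch of the same idea in Java, where strings are objects that carry their own length and every access is bounds-checked (the text and class name are arbitrary):

    public class StringDemo {
        public static void main(String[] args) {
            String str = "Hello";                                   // no buffer to size by hand
            String longer = str + ", I'm feeling a little better."; // no strcpy, no overflow

            // The string knows its own length; there is no terminating null to count.
            System.out.println(longer.length());

            // Walking off the end throws an exception instead of corrupting memory.
            System.out.println(longer.substring(0, 5));             // prints "Hello"
        }
    }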

DLL Hell

On Microsoft operating systems and OS/2, you compiled libraries that might depend on other libraries. The operating system linked these together with a feature called Dynamic Link Libraries (DLLs). But the OS did not do any kind of dependency checking. Because many applications shared versions of the same programming libraries, it was possible, and even probable, that installing your application would replace a library that another application needed with an incompatible version. Microsoft operating systems still suffer from DLL Hell today.

CORBA

As the C++ community grew, it looked for ways to distribute code beyond client-server. The Common Object Request Broker Architecture, or CORBA, emerged quickly. With CORBA, you could build applications from objects with well-defined interfaces. You could take an object, and without adding any remoting logic, use it across the Internet. Companies like IBM tried to push a CORBA model into every object, and companies like Iona focused only on distributed interfaces around remote objects. The kindling around CORBA began to smolder, but never really caught fire. The distribution that was so transparent and helpful was actually too easy. People built applications that relied on fine-grained communication across the wire. Too many round-trip communications led to poor performance, and to reputation problems for CORBA.

Inheritance Problems

C++ nudged the industry in tiny steps toward OOP, but the steps often proved awkward and counterproductive. C++ had at least three major problems:

  • C++ actually did not force object orientation. You could have functions that did not belong in classes. As a result, much of the code written in C++ was not really object-oriented at all; the result was often closer to (C++)--, the language incremented in name and immediately decremented back to plain C.
  • C++ did not force one root object. That led to object trees with many different roots, which proved awkward for object-oriented developers.
  • C++ supported multiple inheritance. Programmers had not accumulated the wisdom born from experience to use multiple inheritance correctly. For this reason, many later languages offer a cleaner, more limited form of multiple inheritance, called a mixin.

Multiple inheritance is a powerful tool in the right hands, but it can lead to significant problems for the novice. Example 2-2 shows an example of multiple inheritance in action. A Werewolf is part Man and part Wolf. Problems arise when both Man and Wolf inherit from a common class, called Mammal. If Werewolf then inherits a method introduced in Mammal, it's ambiguous whether Werewolf would inherit through Man or Wolf, as in Figure 2-2. This problem, known as the diamond inheritance problem, illustrates just one of the problems related to multiple inheritance.

Example 2-2. Multiple inheritance in C++

class Werewolf: public Man, public Wolf

Multiple inheritance is like any power tool. It gives you leverage and speed and can save you time, but you've got to have enough knowledge and experience to use it safely and effectively to keep all your fingers and toes. Most developers using C++ as an applications language had neither.

Figure 2-2. The diamond inheritance problem is just one of the complexities that can arise with multiple inheritance

Consistency

Like Perl, C++ is most definitely an expressive language, but that flexibility comes at an incredible cost. C++ is full of features that might make sense to a seasoned developer, but that have catastrophic effects at runtime. For example, = (assignment) is easily typed where == (comparison) was intended, so a test like if (x = 5) assigns rather than compares, and still compiles and runs. Most new developers will get burned by this problem. It takes years and years of study and experience to become proficient with C++. For systems development, that makes sense, because you ultimately need the performance and control inherent in the ability to put every byte where you want it. Applications developers simply don't want to deal with those low-level details.

Portability

Most developers expected C++ to be more portable, but it didn't turn out that way. We were buried under mountains of incompatible libraries, and inconsistencies between libraries on different platforms. C++ left so much in the hands of the vendors implementing the spec that it turned out to be one of the least portable languages ever developed. In later years, problems got so bad that you often couldn't link libraries built by different versions of the same compiler, let alone by compilers on different operating systems.

Like mud accumulating on a boot, the language that once looked so cool on a resume began to weigh down the brightest developers, and stymie lesser developers completely. Instead of moving to a limited language like Visual Basic or PowerBuilder, they waited, and the storm clouds grew darker still.

Compromises

You don't get a perfect storm without all the conditions. The initial Java explosion was driven primarily by the extraordinary migration of the C++ community. To pull that off, Java had to walk a tightrope with excellent balance. C++ had some obvious warts: an awkward syntax, multiple inheritance, primitives rather than objects, a confusing typing model, poor strings, and clumsy libraries. In some cases, Sun opted for a simpler, cleaner applications language. Java's research roots as an embedded language drove a simplicity that served it well. In other cases, it opted to cater conservatively to the C++ community.

It's easy to look at Java now and criticize the founders for decisions made, but it's clear to me that they walked the tightrope very well. The rapid growth of the hype around Java and the community allowed a success that none of us could have possibly predicted. All of this happened amid an all-out war between Microsoft and IBM! If Java had stopped at this point, it would have been successful. But it didn't stop here. Not by a long shot.

Clouds Open

The sound and fury of the Java storm caught many of us off-guard. And why not? It came from an unlikely source, was delivered in an unconventional vehicle, and defied conventional wisdom regarding performance of interpreted languages. Other than the language, nothing about Java was conventional at all, including the size of the explosion. In retrospect, you can look back and see just how well it filled a void. Figure 2-3 shows the many ingredients that come together to form the perfect storm.

Figure 2-3. Many forces formed the combined ingredients that led to a perfect storm

New Economics

The jet stream that powered this storm emerged from a series of standards: TCP/IP, HTTP, URI, and HTML. The Internet gathered steam, and Sun took full advantage with Java. The Internet was everywhere. Java was cool. The Java developers quickly built the API set that would let developers code for the Internet: TCP/IP libraries for communication, applets for user interfaces that you could embed in a browser, and JDBC for database access.

The perfect combination formed by the relationship between Netscape Navigator and Java drove each company to new heights. Through Netscape, Sun was able to put Java in front of an incredible number of developers, nearly instantaneously. Through Java, Netscape could showcase smart applications that looked cool, and were simultaneously practical. The Navigator/Java combination seemingly solved the most critical problems of client-server computing: management and distribution. If you could install a browser, you could automatically distribute any application you wanted through that browser. Java had the perfect economic conditions for success. It found an important ally in the bean counters, who liked the manageability of the green screen and the productivity and usability of the fat client.

Customers wanted solutions, and Sun realized that Java would give them what they wanted. Sun immediately saw the opportunity it faced. With the open standards around the Internet and the Java language powering it, Solaris on Sun servers would be a compelling, and even hip, alternative. Above all, Java made Sun safe. Because its virtual machine ran in a browser and on many different operating systems, some hard decisions didn't seem so hard. You could try out a deployment scenario. If you didn't like it, you could just move on.

The new jet stream was in position to feed power to the growing storm.

C++ on Prozac

When Lucene founder Doug Cutting called Java "C++ on Prozac,"[1] I immediately liked the comparison. Because of its C++ syntax, Java found an impressive community of developers waiting for a solution. They moved to add hip Java and Internet experience to their resumes. They stayed because they liked it. Java had most of the benefits of C++, without the problems. The similarity of the languages made Java easy to learn. And Java was liberating, for many reasons:

  • Java provided more structure in places that needed it, such as interfaces instead of multiple inheritance (see the sketch after this list).
  • Java eliminated the burden of pointers, improving stability and readability.
  • Memory management got easier, because the JVM's garbage collector automatically took care of abandoned references.
  • Java allowed a much better packaging mechanism, and simplified the use of libraries.
  • Java cleaned up problems like nested include files and macros.
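
To see what "interfaces instead of inheritance" means in practice, here is a sketch that reworks the earlier Werewolf example in Java (the type names and behavior are mine, just for illustration):

    // Java allows implementation inheritance from exactly one class, plus any
    // number of interfaces, so the diamond problem cannot arise.
    interface Man {
        void speak();
    }

    class Wolf {
        void howl() { System.out.println("Howl!"); }
    }

    class Werewolf extends Wolf implements Man {
        // Werewolf inherits howl() from Wolf, and must supply speak() itself.
        public void speak() { System.out.println("Good evening."); }
    }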

Architecture

The benefits of Java went beyond economics and C++. I can still vaguely remember the first sentence that I saw describing Java. Sun said it was a portable, safe, secure, object-oriented, distributed programming language for the Internet. Those words were all buzzwords of the time. For C++ developers, Java's underpinnings made significant strides:

  • The JVM allowed unprecedented portability. Many experts believe that the JVM, and not the language, is the most important feature of Java. Sun marketed this capability brilliantly with the acronym WORA. Java developers the world over recognize those letters as standing for Write Once, Run Anywhere.
  • Sun published the byte code specification for the JVM. People who want to build their own JVM, or build a language on the existing JVM standard, can do so, and can even modify the byte codes of existing applications. Frameworks like JDO do modify byte code with great success.
  • While C++ allowed unrestricted access to application memory, Java restricted access to one area of the JVM called the sandbox. Even today, you see very few exploitations of Java security.
  • The Java metamodel, made up of the class objects that describe types in Java, allowed sophisticated reflective programming (a short sketch follows this list). Though it's a little awkward, this capability extends far beyond the basic capabilities of C++. The Java metamodel enables frameworks that increase transparency, like Hibernate (persistence) and Spring (services such as remoting and transactions).
  • The fathers of Java saw the importance of security, and baked it into the language. Java introduced a generation of programmers to the term sandbox, which limited the scope and destructive power of applications.
  • Java had improved packaging and extensibility. You could effectively drop in extensions to Java that transparently added to the capabilities of the language. You could use different types of archives to package and distribute code.
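
As a taste of that metamodel, here is a minimal reflection sketch; the choice of String and its length() method is arbitrary, picked only to keep the example self-contained:

    import java.lang.reflect.Method;

    public class MetamodelDemo {
        public static void main(String[] args) throws Exception {
            // Every type is described at runtime by a Class object.
            Class type = Class.forName("java.lang.String");

            // A framework can discover and call a method it never saw at compile time.
            Method length = type.getMethod("length", new Class[0]);
            Object result = length.invoke("Hello", new Object[0]);

            System.out.println(result);   // prints 5
        }
    }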

Both the low-level grunts and high-level architects had something to love. Businesspeople had a motivation to move. At this point, if all else had failed, Java would have been a successful language. But it didn't fail. The winds just kept picking up speed, and the storm started feeding on itself.

Fury Unleashed

Applets captured the imagination of programmers everywhere. They solved the deployment problem, they were cool, and they were easy to build. We're only now finding a set of technologies, based on the ugly and often objectionable JavaScript language, that can build rich content for the Web as well as Java did. Still, applets started to wane.
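
An applet really was only a few lines of code. A minimal sketch looks like this (the class name and greeting are invented for illustration):

    import java.applet.Applet;
    import java.awt.Graphics;

    // The browser downloads the class, creates an instance, and calls paint()
    // whenever the embedded region needs to be drawn.
    public class HelloApplet extends Applet {
        public void paint(Graphics g) {
            g.drawString("Hello from an applet", 20, 20);
        }
    }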

Even today, I think that applets represent a powerful idea, but they fizzled out for many reasons. The Netscape browser's JVM was buggy and unpredictable. Further, with such a rapidly evolving language, applets presented many of the same problems that client-server computing did. You might not have to maintain applications, but you still had to maintain the browser. After you'd deployed a few Java applets, you had to worry about keeping the right version of the browser on the desktop. As the size of the JVM grew, it became less and less likely that you could install a JVM remotely. Even if you could, Java versions came out often enough, and were different enough, that applications frequently needed a newer JVM than the one already on the desktop. But a few mad scientists at Sun were up to the challenge again.

Servlets

As applets were winding down on the client side, the server side was just getting going. Servlets gave Java developers a way to write applications that ran on the server but reached any browser. An application would get a request over HTTP, and build a plain web page, with no Java in it, that would return to the client. Since the pages were built server side, they could carry dynamic content, like the results of database queries, back down to the client. So-called web-based applications finally delivered the goods: now, you could run enterprise applications from any client, and deploy them only on the server.
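
A bare-bones servlet sketch shows the model; the class name and the markup it writes are made up for illustration:

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // The container calls doGet() for each HTTP GET request; the servlet
    // writes plain HTML back to the client, with no Java on the wire.
    public class HelloServlet extends HttpServlet {
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<html><body>");
            out.println("<h1>Hello from the server</h1>");
            // Dynamic content, such as query results or session data, would go here.
            out.println("</body></html>");
        }
    }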

It didn't take long to understand that the clients could be within the firewalls of a company, but they didn't have to be. Since people everywhere had Internet access, it opened up the possibility of selling a whole new kind of product: information. The new economy was born. At least in part, it was powered by Java, and by the companies that built the servers, databases, and software. Start-up companies sprang up to take advantage of this opportunity. Enormous paper wealth was created. Venture capitalists funded good ideas and bad. A drive for customers fed the fury of the storm. The rules were simple: he who gets the most customers wins. Start-ups were often willing to spend far more to acquire a customer than that customer could possibly generate.

Real wealth was created, too. Companies like eBay and Amazon fueled a new kind of economy without buildings or walls. This new, sophisticated commerce drove a need for new tools. Sun, Oracle, BEA, and IBM worked on new standards to enable the enterprise on the Web. IBM coined the term e-business to stand for a new, powerful way to serve customers.

J2EE

J2EE, the Java 2 Enterprise Edition, included many new ways to connect to the enterprise. Amid great expectations, the Enterprise JavaBeans (EJB) spec emerged to add a rich set of tools that would let you program distributed, transactional, secure, and persistent applications, without coding those services yourself. Clustering features enabled good scalability and reliability. These features let major companies move into the Java world without reservation.

Though EJB never quite fulfilled its promise, the specification is an extraordinary example of how an idea can energize a community. The specifications behind EJB are tremendously important, and for the most part, are factored very well. Java thrived on the server side and was off to the races again.

Industry Standards

It's tough to unite an industry through common interests. Java never could have thrived to the extent that it has with only Sun behind it. Some unifying force needed to hold the other vendors together. A common enemy in Microsoft was the perfect catalyst.

Software is more prone to monopolies than most other industries because software moves fast and obsolescence can devastate a company. For this reason, market share tends to favor the market leader heavily. So it stands to reason that market leaders love to be proprietary. They can increase market share through their leadership position, and lock their customers in to extend the monopoly. Certainly, Microsoft is not the first company to use this strategy. IBM was incredibly proficient at this game.

If being proprietary works for the market leader, the followers need open standards to level the playing field. If you can't build dominant share, you can lend your customer safety by creating partnerships and embracing a common standard. In this way, your customers are not nearly as afraid of obsolescence.

The Unix operating system helped smaller proprietary server vendors survive for years in the face of market dominance by Intel and Microsoft. After supporting proprietary systems aggressively for decades, IBM is embracing open standards in many areas, including relational databases (where it trails Oracle), operating systems (where it made mainframes a much safer solution with the open source Linux environment), and now, with Java.

IBM is now the most prevalent Java developer. It claims to have more Java developers than any other company, including Sun. I believe IBM. It has been working to catch BEA's WebLogic application server for years, and has now passed BEA. I'd expect IBM to exercise its dominance to build in proprietary features that interest its customers. I would also expect IBM to take a harder line with the Java Community Process (JCP), to force through changes that it finds most interesting. Failing that, it may leave the JCP and seek another avenue for establishing standards. If it does, this strategy should not come as a surprise. It's the prerogative of the market leader, and the dance goes on.

Open Source

Many open source communities look down on Java. That's ironic, because Java has more thriving open source software than any of the alternatives. When you build something that's both hip and popular, people want to play with it and share their creations. Add a massive community that's stretching a language in unexpected ways, and you need only to stand back and watch interesting things happen. And boy, did Java open source happen.

At first, Sun resisted the open source community. Sun developer James Duncan Davidson worked to change that. He built two of the most important Java applications ever in Tomcat (which showcased servlets) and Ant (which builds nearly all Java applications today), and then pushed them out to the open source community.

The typical open source development cycle works as follows (and is shown in Figure 2-4):

  1. Build. Once Java geeks have solved a problem often enough, they build a general solution with their own resources. Sometimes, they're solving business problems. Other times, they're just having fun.
  2. Use. Users then exercise the solution. Solutions that don't get used atrophy and die.
  3. Refine. Users then refine the solution, to match their requirements.
  4. Contribute. Users then contribute to the project, either with feedback or with code enhancements. They are willing to do so, because they won't have to maintain enhancements.

Figure 2-4. The open source feedback cycle is tremendously important to Java

In this way, some fantastic frameworks evolved to form the foundation of Java web-based development. Today, you'd be hard-pressed to find a major company that does not take advantage of open source software. These solutions are pervasive in the Java community:

  • Developers use JUnit to build automated test cases, which run with every build (a short example follows this list).
  • IT shops run Apache Web Server as their preferred web server.
  • Customers deploy many lightweight applications with Tomcat as a servlet container.
  • Developers look to Hibernate for persistence.
  • Web-based developers use Struts to separate model, view, and controller layers of their applications.
  • Programmers worldwide use Ant to build applications.
  • Other frameworks, like Lucene (search), Spring (infrastructure), Tapestry (web-based component design), and JBoss (J2EE), seem to be gaining popularity as well.
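
As a small example of the first item, a JUnit test case of that era looked roughly like this; the assertions are trivial and invented purely for illustration:

    import junit.framework.TestCase;

    // JUnit 3 style: any public method whose name starts with "test" is
    // discovered and run automatically, typically as part of an Ant build.
    public class StringTest extends TestCase {

        public void testConcatenation() {
            assertEquals("Hello, world", "Hello, " + "world");
        }

        public void testLength() {
            assertEquals(5, "Hello".length());
        }
    }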

You might think that open source development would threaten commercial software companies, but the opposite is true. Open source has served Java very well. Innovation in the open source community keeps tremendous pressure on software companies to keep up. That's healthy. If you're providing real value, you'll thrive. If you try to live off of technology that's well understood and popular, you'll die. Open source software raises the bar of what you've got to do to make money. IBM has dealt with the pressure well. BEA is withering under the heat, with IBM above and JBoss below. Either BEA will innovate, or an open source framework like JBoss, Geronimo, or Spring will catch it on the low end. Either way, you'll win.

You could even argue that open source software is driving the most important innovation in the Java space. Open source is driving adoption and implementation of integrated development environments, aspect-oriented programming, lightweight containers, persistence, unit testing, and the best web MVC frameworks. It's driving the actions of the largest and most powerful Java vendors. That's an incredible testament to the might of the Java open source community.

Aftermath

I believe that Java is now the most successful programming language ever. It redefined the way we package and deliver software. It changed the way we feel about interpreted languages, and the way we build Internet applications. Java changed the very economics of application development by bringing deployment and management into the overall equation. It built a new affinity for libraries, with strong web-based support. Java changed the rules of the game: it completely rewrote the rulebook defining what it takes to be a commercially successful programming language.

In some ways, Java's new rulebook will serve us well. To achieve similar success, a new language will need to be portable and encourage a vibrant open source community. It will need broad appeal, across low-level programmers and architects. It will need to embrace compelling standards.

But technology is only part of the problem. For a new language to succeed, you'll also need a compelling business reason to switch. In some ways, Java held us back by discouraging competition. You may be tempted to use Java, even if it's the wrong tool for the job. You may work harder than you have to, because you're not free to explore alternatives. And this situation may lure you into a false sense of security, just as so many Java developers have grown comfortable wholly inside Java's cocoon.

Moving Ahead

We may never again see a perfect storm like the one that ushered in Java. You shouldn't look for one. Instead, you should learn from the success of Java, and start to understand the factors that led to its success. Minimally, I believe the next commercially successful programming language will need to satisfy four major criteria:

  • It will need to establish a significant community. You won't see broad adoption unless the adopter can achieve relative safety.
  • It will need to be portable. Java's virtual machine has raised the bar for languages that follow.
  • Some economic incentive must justify the movement. Currently, productivity looks to me like the logical economic force, but others may be lurking out there, like wireless computing or data search.
  • It will need demonstrable technical advantages. This is actually the least important of the major criteria.

I don't think most of us can fully understand the success of Java. It's easy to overestimate the role of the language and to underestimate the importance of the JVM and the community. In the next chapter, we'll continue to look at the crown jewels of Java in more detail: the foundation of the most successful programming language ever.

Notes

  1. TheServerSide.com, "Doug Cutting—Founder of Lucene and Nutch," Tech Talk (March 10, 2005); http://www.theserverside.com/talks/videos/DougCutting/interview.tss.