Testing Is the Engineering Rigor of Software Development
Developers love tortured metaphors when explaining to family members, spouses, and other non-techies what it is they do. We frequently resort to bridge building and other "hard" engineering disciplines. All these metaphors fall down quickly, though, when you push them too hard, because software development differs from the "hard" engineering disciplines in important ways. For example, think about the building in which you currently sit: how long it took to architect, design, and build, and the resources it required. Then think about a software project of similar size. In the software project, a single small defect can cause the entire thing to fail in unpredicted ways. If a light-socket cover doesn't completely cover its hole, it can't cause the building to fail, but a similarly small defect in software can cause major problems. Of course, software is easier to fix than a building, but the tolerances for perfection in software are ridiculously fine. In software, we also often lack a direct correlation between a fault and its location. If a wing falls off a plane, an engineer can start snooping around the bolts that held the wing in place to determine what went wrong; in software, the code that fails is frequently nowhere near the defect that caused the failure.
Compared to "hard" engineering, software development is at about the same place bridge building was when the common strategy was to build a bridge and then roll something heavy over it. If it stayed up, it was a good bridge. If not, well, back to the drawing board. Over the past few thousand years, engineers have developed mathematics and physics they can apply to structural problems without having to build something just to see how it behaves. We don't have anything like that in software, and perhaps never will, because software is in fact very different. For a deep-dive exploration of the comparison between software "engineering" and traditional engineering, check out "What Is Software Design?", written by Jack Reeves in C++ Journal in 1992. Even though it was written more than a decade ago, it is still remarkably accurate. He paints a gloomy picture of the comparison, but the thing missing in 1992 was a strong testing ethos for software.
Testing "hard" things is tough because you have to build the thing to test it, which discourages speculative building just to see what happens. But the building process in software is ridiculously cheap. We can design a new thing and build it just to see how it reacts, and we've developed an entire ecosystem of tools that makes it easy to do exactly that: unit testing, mock objects, test harnesses, and lots of other stuff. In fact, other engineers would love to be able to build something and test it under realistic conditions. As software developers, we should embrace testing as the primary (but not the only) verification mechanism for software. Rather than waiting for some sort of calculus of software to arrive, we already have the tools at our disposal to ensure good engineering practices. Viewed in this light, we also have ammunition against managers who tell us, "We don't have time to test." A bridge builder would never hear from the boss, "Don't bother doing structural analysis on that bridge -- we have a tight deadline." Recognizing that testing is the path to reproducibility and quality in software allows us as developers to push back on such arguments as professionally irresponsible.
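To make the cheap build-and-test loop concrete, here is a minimal sketch using Python's built-in unittest and unittest.mock modules. The function, names, and numbers are invented for illustration, not taken from the essay; the point is only how cheaply a mock object stands in for an expensive collaborator.

```python
import unittest
from unittest.mock import Mock

def total_price(items, tax_service):
    """Sum item prices and apply the tax rate supplied by a collaborator."""
    subtotal = sum(item["price"] for item in items)
    return round(subtotal * (1 + tax_service.rate_for(items)), 2)

class TotalPriceTest(unittest.TestCase):
    def test_applies_tax_from_mocked_service(self):
        # A mock stands in for a real tax service: cheap to "build",
        # so we can exercise the design under realistic conditions
        # without the real dependency existing yet.
        tax_service = Mock()
        tax_service.rate_for.return_value = 0.10
        items = [{"price": 10.00}, {"price": 5.00}]
        self.assertEqual(total_price(items, tax_service), 16.5)

if __name__ == "__main__":
    unittest.main()
```

Nothing heavy had to be rolled over anything: the "structure" was built, loaded, and verified in a few milliseconds.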
Testing takes time, just as structural analysis takes time, and both activities ensure the quality of the end product. It's time for software developers to take up the mantle of responsibility for what we produce. Testing alone isn't sufficient, but it is necessary. Stuart Halloway has a great quote about software quality: "In 5 years, we will view compilation as the weakest form of unit testing." Implicit in this quote is the realization that testing is the engineering rigor of software development.
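Halloway's quip can be illustrated with a tiny hedged example (the functions and values here are invented, not from the essay). Both versions below pass the "weakest unit test" -- they parse and compile cleanly -- but only a real test distinguishes the correct one from the buggy one.

```python
def average_buggy(values):
    # Passes compilation/parsing just fine, yet divides by the wrong count.
    return sum(values) / (len(values) - 1)

def average(values):
    # The corrected version; the compiler sees no difference in "validity".
    return sum(values) / len(values)

# A test, not the compiler, is what tells these two apart:
assert average([2, 4, 6]) == 4
assert average_buggy([2, 4, 6]) != 4
```

The compiler blessed both functions equally; only the assertions caught the defect.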