Feedback: Software Development Paradigm Trap

David A. Land - 5/31/2006

I strongly agree with the thrust of your article, which is that there is no fundamental reason why software has to be so buggy and/or dysfunctional.

I started my career as an electrical engineer in 1965, so I kind of grew up during the birth of integrated circuits, minicomputers, and microcomputers. Along the way, I was thrust into the world of programming and did it, watched it, led it, and/or helped it develop in various capacities.

And I heartily agree that the typical software development paradigm is deeply flawed. Many projects begin by selecting the language and the development tools. This would be like starting a mechanism design by selecting "welding" and specifying the fuel supplier and brand of torch to be used... even before the end item features are defined.

It has always been my view that all standards must relate to properties of the Thing being implemented, and that each of them must be easily explainable to the end user of the Thing in terms of their function within the Thing.

And it is my experience with many multidisciplinary projects over four decades that if everyone has as their primary focus the success of the Thing instead of the methodology used to create the Thing, project milestones, QA benchmarks, and product release schedules pretty much take care of themselves... even when multiple hardware and software groups are involved at widely dispersed locations.

When people subordinate product feature functionality to implementation methodology, bad things happen. For an automotive example, consider the airbag: a government-mandated method that subordinated "safety," the actual product feature, with the result that the device as deployed caused deaths that would not otherwise have occurred (the worst possible kind of "bug"). The same thing happens when "Use .NET," "Use RPC," or even "do it with software" replaces a product functional specification.

I enjoyed the article. Keep fighting the good fight.

David A. Land, President
LPI Information Systems


Mark Bereit - 5/31/2006

David,

Thanks for your comments! I particularly liked your point about selecting language and tools before defining the end features.

I believe that the software development process needs some new tools, though, because even with a clearly defined end specification ("DVD player") the tools we have are all for building large algorithms. The Thing we need to define, as an industry, is what better tools would look like for robust, maintainable electronic systems. If we could do that I suspect that the means would start to come into focus.

Thank you for sharing your thoughts.

Mark Bereit


David A. Land - 6/1/2006

Mark -

I agree: we need tools which assure the integrity of INTERFACES...not just the PROCESS... so that useful, reusable modules can truly be created.

Some software thing akin to the pipe thread "NPT" or "ANSI" mechanical specs needs to be created for modules which accomplish commonly needed functions.

- David Land


Mark Bereit - 6/10/2006

David,

Trusted interfaces and standardized sub-components would be wonderful things. Another reader likened this need to the development of the 74xx series of TTL logic chips, which became such a "standard" approach to building low-level digital logic that to this day the follow-on logic families use the same part number roots for the same pin-outs, and even FPGA and CPLD design tools often mimic these components. This wasn't an ANSI or ISO committee job, just something that was so useful people used it and redefined their thought process around it.

I hope we get to see the same kind of worldview shift in software development...

Mark Bereit


David A. Land - 6/10/2006

Mark - I like the 74xx analogy... it was a "paradigm buster," too!

As I remember it, when TI gave birth to the 74xx we had various vendors competing with each other to sell small RTL logic cards (DEC and many others). Each nominally performed similar functions, but the cards from one vendor would not fit the backplanes of another... and the RTL gates were electrically different enough from one another that you wouldn't try to mix cards from different vendors anyway.

TI did several things differently with the 74xx line. First, there was a family of logic gates in standard IC packages: they all worked together, so you didn't have to worry about circuit design... just the logic.

Second, to get large users and the military/industrial complex to standardize on its devices, TI immediately cross-licensed all new designs to other, unrelated vendors so they could make exact equivalents of the 74xx line (this is one main reason AMD was born), eliminating the "second source" barrier that is part of any long-life procurement cycle. The immediate effect was that manufacturers standardized on the 74xx line and required their organizations to purchase at least some parts from each of the 74xx "second sources" in order to keep them viable.

This industry-wide second-sourcing forced further standardization, spurred development of many compatible ICs, kept prices down, and made newer MSI, LSI, and later devices immediately useful. Another great benefit of second-sourcing was that if the primary supplier had a problem with a part, you could buy a "better" version of the same part from a second source with no downtime; most of the time you just plugged in the substitute and the production line never missed a beat. It was a bold, innovative, risky, and daring approach at the time... but it worked... and became the model used by the rest of the IC industry to this day.

So how can we replicate this success with software?

We need a broad family of functional modules from multiple sources which are designed to work together, have common interface methodology, and have very good and open documentation. Furthermore, we need to be able to put these modules together in a standard way which allows us to substitute a "better" module easily...to fix problems or add functions...preferably in the field.

Multiple-sourced software? This would require a re-think in the way software is developed and paid for. But it would be great for the industry, and would lead to many more viable new software companies in each specialty. Open specifications?...YES! Cross-licensed designs...GNU, OSF, or other...co-existing with proprietary (but open-specification) replacements...why not?

Field replaceable modules? This would require a re-think in the way software is structured and deployed. Installation methodology needs a much higher level of standardization and a drastic rework...something different from the one-program-at-a-time thinking...more like one function at a time...maybe closer to the dynamic linking loader idea...detailed configuration tracking like RPM does, but with hardware resource requirements also factored in.

And in order to make this work, Operating Systems have to actually manage resource allocation. They must "know" exactly what generic resources each module needs before it loads; and, under the primary operator's control, detect-and-flush any module which tries to seize more resources than it contracted for. This is what operating systems did before they abandoned their primary charter and degenerated into vehicles for forcing unwanted and "irreplaceable" applications and assorted spyware into your computer.

SUGGESTED GOALS FOR THE NEW PARADIGM:

We can have systems which absolutely cannot be caused to "crash" via software. We can have systems where the power-fail/auto-restart sequence works each and every time... and where applications come back to life "hot." We can have systems which are "fault tolerant" and simply shrug off many hardware failures. We can and should have systems which "gracefully degrade" under severe loading and which "gracefully upgrade" when software or hardware resources are added. We should have unimpaired access to our choice of the "best" software module for each function... and we should be able to mix and match modules from diverse sources in order to obtain the best possible system for each industry... each company... and/or each individual installation. All of these things are quite possible: they have been done before... on a smaller scale... long before the industry was seduced by the "computer on a chip"... and they can be done again - even better this time.

I hope I live to see the day when the desire for these benefits is so universal and so strong that the user community forces the requisite changes to occur.

David A. Land
LPI Information Systems