
Feedback: Software Development Paradigm Trap

Steve Fairhead - 8/5/2006

I read your article. I agree strongly with the premise (decoupling), but disagree re the conclusion (more CPUs).

I have a particular style in designing software. It involves aiming for the simplest possible solutions. I fervently believe in a design approach that lets anyone read the code, understand it, and conclude that all the elements are boringly trivial. Nothing clever, just a hierarchy of simple things. The trick is as you describe - to partition correctly, whether in software or in hardware.

Hardware costs money, for every product shifted. My background is in medium-volume manufacture (10s of thousands/annum). There is always pressure to design down to a cost.

But "good" software design costs nothing - in fact, it saves money. Barring typos, correct design tends to work first time. It survives maintenance. It is scalable. It is understandable. It works. There is no debugging - an activity I actively hold in contempt. Debugging (beyond typos) represents a mistake. I design to avoid such mistakes, by making things simple.

For sure, at the hardware design stage there are trade-offs. Sometimes a multi-CPU design is more cost-effective than an I/O-intensive single-CPU alternative. But - the basic principle still applies, even when a single CPU is the better choice.

For me, the key is synchronism. Hardware folks learned long ago that synchronous logic was more reliable - due to fewer variables - than asynchronous logic. So many times I've seen "complexity" used as an excuse for a preemptive multitasking RTOS - rarely have I seen this work well. My choice would always be a cooperative multitasker, where tasks communicate with each other synchronously. It takes more discipline ("Thou Shalt Not Pend"), but simplifies things. Where a team of designers is involved, it takes more leadership. It requires more precision in the design of the interfaces, and an absolute ban on globals. Such things tend to make more reliable products.

But fundamentally, I'm with you. A collection of simple things, with clearly defined interfaces, is more likely to work long-term than the alternative.


Steve Fairhead - SFD - Solutions by Design

Mark Bereit - 8/6/2006


Thanks for your comments. And if we agree to disagree about the need for multiple processors, I'm glad we can agree on the need to divide projects into small, well-designed components.

I see multiple processors as an extreme form of the discipline: when components share no common memory space, it is simply not possible to throw pointers around. I also see the ongoing pressure for more compute power pushing microprocessor makers toward ever more cores sharing the same memory space, increasingly badly; at some point a shift to cores that do not share the same resource bottlenecks could become essential.

But even if these approaches never become widely viable, a design process that breaks the project into well-designed, maintainable, and inherently reusable elements is very important. As an industry, we've still got a long way to go on these goals.