Friday, 10 September 2004

Worst. Idea. Ever.

File this one, in sadness and mourning, under Software Development.

Worst Idea Ever.

Two quotes:
Acting as spokesman for the concerned engineers, Gerald Wilson compiled a 50-page dossier detailing the unsuitability of Windows as a foundation for a naval command system, and arguing that BAE's Unix history and expertise made open source UN*X a logical and viable way forward. The company then made him redundant.

In April 2002, Bill Gates, acting as Microsoft's Chief Software Architect, gave extensive testimony under oath to the US Courts. Gates's testimony included a description of the current structure of Microsoft Windows. Snubbing fifty years of progress in computer science, the current structure of Windows abandoned the accepted principles of modular design and reverted instead to the much-deprecated entangled monolithic approach. Paragraphs 207 to 223 are particularly revealing about Microsoft's chosen approach (paragraph 216 is difficult to believe!). Anyone with elementary knowledge of computer science can see that Microsoft Windows, as described here by Gates, is inherently insecure by design. If this is a flagship Operating System, then Dijkstra's life was in vain.

And Paragraph 216 (Bill Gates testifying under oath):
"In a purely theoretical world, one could imagine developing modest software programs in such a way that any module could be swapped out in favor of a similar module developed by a third party. The replacement module would need to conform identically to the interfaces expected by all of the modules with which it interacts. In the commercial world, it is hard to see what value such replace-ability would provide even if it could be achieved."
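For what it's worth, the "replace-ability" being dismissed there is just the ordinary plug-in idiom, decades old by 2002. A minimal sketch in C (the `logger_module` interface and both implementations are hypothetical, purely for illustration): any implementation that fills in the same table of entry points can be swapped for another, and client code never knows the difference.

```c
#include <stdio.h>

/* A hypothetical module interface: a named table of entry points.
 * Any third-party implementation that conforms to this table can be
 * swapped in for another. */
struct logger_module {
    const char *name;
    int (*log)(const char *msg);   /* returns number of chars written */
};

/* Two interchangeable implementations of the same interface. */
static int log_to_stdout(const char *msg) { return printf("%s\n", msg); }
static int log_discard(const char *msg)   { (void)msg; return 0; }

static const struct logger_module console_logger = { "console", log_to_stdout };
static const struct logger_module null_logger    = { "null",    log_discard  };

/* Client code depends only on the interface, not on which module
 * happens to sit behind it. */
int run_with(const struct logger_module *m, const char *msg)
{
    return m->log(msg);
}
```

Swap `&console_logger` for `&null_logger` in a call to `run_with()` and nothing else changes - which is exactly the property the testimony claims has no commercial value.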

For those readers who aren't interested in software engineering and use a Windows-based computer, imagine the following: a cable runs from the back of your computer to your chair. If your computer crashes, full mains voltage is applied to the chair.

I've actually helped make systems so reliable that I'd be content if the above example weren't fictional. Hundreds of people's lives depend on some of my, and others', software working first time, every time. In case you haven't noticed, it's unlikely the computer you're using to read this is quite so reliable. Even if it is a Mac, or a Linux box.

If the Register article above is accurate, then it looks like engineers have been overruled by suits. As happened with Columbia, and with Challenger. Hopefully the deaths that result will be numbered only in the hundreds - because some of the ships in question have nukes on board.


Ancient_Hacker said...

Windows NT/XP vs Unix, modularity thereof.... Hmmmm.......

Opinions will vary, but in many ways, Windows NT/XP is very stable and modular:

* I've been running Windows XP for 2+ years now, on about 12 PCs running 24/7. I've gotten exactly one Blue Screen Of Death, and that was when I accidentally yanked an Ethernet card out of its PCI socket. Our one Linux server has had similar reliability.

* Windows NT from day one was designed in an extremely modular fashion. There is a Hardware Abstraction Layer that is the one and only interface to the raw unvirtualized hardware. There is another module on top of that which provides basic system services through a very strict server protocol. The system layers above that are all contained in very modular DLLs, all of which communicate through explicitly named entry points and named memory segments. True, the layering is less than optimal, but that can't be helped in a one-off, first-time design such as this.

* On the other hand, Unix originally came with a monolithic kernel: one linear swath of system code, all linked together, not a hint of modularity. Various offshoots have added means of modularizing the device drivers and system services, not all of them in pleasant or palatable ways.

* The modularity of NT/XP is quite robust and flexible. It's quite possible (with enough permissions) to add or replace system modules. This is both good (as a sign of good modularity) and bad (if some virus gets enough permissions to replace a system module).

* In both Windows and Unix many of the modules have well-designed interfaces. Others, such as the multimedia ones in NT and ioctl() in Unix, have really terrible interfaces. A lot of the code seems to have been designed and written at 3 AM to meet an 8 AM deadline. This is not unusual in the real world.

* Almost all of the shaky elements of Windows NT/XP can be turned off by a few mouse clicks. Turn off most of the services in "msconfig" that listen on various ports to provide the many proprietary, useless and slow windows services (WMI, Telephony, net DCOM, Telnet, Web server, many more...). Remove Internet Explorer or at least turn off its willingness to run ActiveX components. Remove Outlook, or at least turn off its eagerness to show mail as HTML.

* Overall I'd give Unix a slight edge in reliability; your opinion may vary. No, I'm not interested in a flaming "best OS" war. Both systems have strong and weak points. When I want to type or browse the Web, I use WinXP. If I ever wanted to run a big web site or serve a lot of files, I'd probably use Linux. I can't see typing in Linux/OpenOffice being any better than using WinXP/MSWord; I can't see using WinXP for heavy-duty serving either.

* BTW, you can have BOTH: I have Red Hat Linux and WinXP and Win 98 and MSDOS all running under VMware, all at the same time, all quite happy to do so for days on end.

* Dunno if I'd like nuclear launches controlled by any of them!



Zoe Brain said...

OK, no religious arguments about the virtues or otherwise of various OSs.

One thing though: When I first started getting involved with Unix in 1977, I said 'This is THE Operating System for the 1980s!'.
I stand by that remark today, in 2004. Unix is the OS for the 1980s, and it's a scandal and a shame that there has been no radical improvement on it. Linux is an improvement, although I favour FreeBSD myself, but it's not a radical one.

The hardware abstraction layer of NT is good. The 'basic services' one isn't bad, but I'd like to see more formal testing before I'd pronounce it good enough to bet my life on.

Personally, I'm all in favour of Naval Combat Systems as bare machines. My experience has been that you can write a nice, tight, efficient and formally-provable kernel, with only the capabilities you want and nothing else that can be exploited, in a very short time. Your main problem with this approach is that hardware very often doesn't meet its own specs: a CPU with one stepping of one version will behave differently from another stepping of the same version. But at least you find these problems early in the development cycle.
Languages such as Ada-95, and the formally-provable SPARK subset of Ada, to a great degree contain their own OS built in - or at least enough of one for many purposes. Tasking (threads), basic I/O, that type of thing.
For example, on FedSat, we used the RTEMS kernel (POSIX wouldn't hack it), then just put Ada-95 on top. This handled memory management, paging, and interfacing between IEEE and 1750A floats, and it all works, despite getting zapped by the radiation in the SAA six times per day.
It's not particularly difficult (compared with, say, the problems in trying to get a C++ program compiled in CodeWarrior 7 on a Mac to port to MSVC++ 5.0 on a Wintel box).

Appropriate Software Development techniques for things like Nuclear Power Plants, and Naval Combat Systems, and Heart Pacemakers are fundamentally different from ones appropriate for Doom VII, IE 9, Oracle 15, or even TaxPak. I'd no more attempt to run every program on a Wintel box as a turnkey bare-machine system than I would have Win2K controlling a submarine's depth.