The Associated Press reported on 9/29 from Hartford, Conn., that for the last three years Hartford residents have been excluded from the federal grand jury pool. The problem was discovered in a lawsuit disputing the racial composition of the federal grand jury that indicted a minority defendant for bail-jumping. Apparently, the city name for Hartford residents had been typed in the wrong place (wrong field?) in computer records, so that the "d" in "Hartford" overflowed into a status field, marking the named person as deceased. Fernando Pereira, 2D-447, AT&T Bell Laboratories, 600 Mountain Ave, PO Box 636, Murray Hill, NJ 07974-0636 email@example.com [Noted by others as well... PGN]
My union just sent out a letter to its members containing (among other things) the note: "If your salary is less than 11844 or greater than 32767, please notify us immediately and we will adjust your membership fee." The membership fees depend on the salary. My salary is higher than the first figure quoted, and I already pay the maximum fee. So why the request to people with a salary greater than 32767 Swedish crowns? Since 32767 is the greatest integer you can represent in 16 bits with 2's complement arithmetic, I am willing to bet that their computer misrepresents larger salaries. I guess that someone with an income of, say, 33000 crowns is charged the fee for an income of 33000-32767=233 crowns! That would be the minimum fee... Someone must have noticed this, and now they must correct those cases manually... Lars-Henrik Eriksson, Swedish Institute of Computer Science, Box 1263, S-164 28 KISTA, SWEDEN Phone (intn'l): +46 8 752 15 09
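[The 16-bit limit Lars-Henrik suspects is easy to demonstrate. A minimal sketch in Python, using ctypes purely as a convenient way to reinterpret an integer's low 16 bits as a signed two's-complement value; note that a strict wrap gives a negative number rather than 233, so whatever the union's system actually does with overflowing salaries is a guess:

```python
import ctypes

def as_int16(value):
    """Reinterpret an integer's low 16 bits as a signed
    two's-complement 16-bit value."""
    return ctypes.c_int16(value).value

print(as_int16(32767))  # 32767 -- the largest representable salary
print(as_int16(33000))  # -32536 -- a strict two's-complement wrap goes negative
```

A system that stored salaries in such a field could thus report a 33000-crown salary as something wildly wrong, whichever way it mishandles the overflow. PGN]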
The incident _was_ caused by a computer. An automated system was "in the loop" that examined sensor inputs (including the controls for the thrust reversers and brakes (and, in the other incident, the flaps), and the indication given by the "squat" switch), and decided whether or not to deploy the thrust reversers and the landing wheel brakes. It seems odd to me (but then again, I'm not an engineer designing commercial aircraft) that the squat switch should _also_ disable the brakes, which don't seem to pose a safety hazard if used during flight. This seems like an area where a cockpit crew needs to be able to explicitly override the safety system. I can imagine a sort-of dialog (not with spoken or even typed words, but by command actions - pulling levers and switches, and so on...):
  Crew: Deploy thrust reversers.
  Safety Systems: No. Use of thrust reversers in flight will destroy the aircraft.
  Crew: Acknowledged. Override safety constraints! Deploy thrust reversers NOW!
[...] We still put human crews in airliners. Maybe the next step is to admit that the safety systems are fallible, and give crews a way to overcome that fallibility. There are numerous design issues involved, though. The safety systems are there for a purpose, and the bypass mechanism should have enough restrictions that it is used only in an emergency, rather than as a way of avoiding a routine bother. On the other hand, if the restrictions are too severe, the crew will be unable to override the systems when they have to. At around 200 MPH, a jetliner runs out of runway _very_ quickly, and there isn't a lot of time for access codes, or even synchronizing movements of two crew members. Perhaps requiring a lot of paperwork _after_ the use of the override system would be appropriate.
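[The acknowledge-then-override dialog above can be sketched as interlock logic. This is purely an illustration of the proposed protocol, not any real avionics system; the function names and the two-step confirmation are assumptions for the sketch:

```python
def thrust_reverser_permitted(on_ground, override_commanded, override_acknowledged):
    """Hypothetical interlock: deploy freely on the ground; in flight,
    deploy only if the crew has both commanded an override AND
    acknowledged the safety system's warning."""
    if on_ground:
        return True
    # In flight: refuse unless the crew explicitly confirms the override.
    return override_commanded and override_acknowledged

# Normal landing: squat switch reports weight-on-wheels.
print(thrust_reverser_permitted(True, False, False))   # True
# In flight, no override: the interlock refuses.
print(thrust_reverser_permitted(False, False, False))  # False
# In flight, crew commands and acknowledges the override.
print(thrust_reverser_permitted(False, True, True))    # True
```

The design tension the poster describes lives in that second parameter pair: too many confirmation steps and the crew cannot override in the seconds available; too few and the override becomes routine. PGN]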
Some papers run personals among the classified ads. The New York Review of Books runs not only personals, but, right above them, therapy ads too. This appeared in the October 8, 1992, edition: FEELING HELPLESS ABOUT DEPRESSION? Overcoming Depression 2.0 provides computer based cognitive therapy for depression with therapeutic dialogue in everyday language. Created by Kenneth Mark Colby, M.D., Professor of Psychiatry and Biobehavioural Sciences, Emeritus, UCLA. Personal Version ($199), Professional version ($499). Malibu Artificial Intelligence Works, 25307 Malibu Rd, CA 90265. 1-800-497-6889. The risks, to coin a phrase, are obvious. If anyone who happens to live in the 'States followed this up, I would be fascinated to know what exactly this thing is. Sean Matthews, Max-Planck-Institut fuer Informatik, Im Stadtwald, W-6600 Saarbruecken, Germany +49 681 302 5363 (firstname.lastname@example.org)
With regard to garage door opener security: I was recently asked to inspect a malfunctioning garage door opener transmitter for a friend's mother. I used a screwdriver to open it up, and found a broken battery wire. The unit included a microcomputer and a DIP switch for a 12-bit password. I don't think I'd be revealing any great secret by telling you what her password was. It was the binary number 000000000001. Mark Thorson (email@example.com)
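[A 12-position DIP switch gives a keyspace small enough to enumerate trivially, as this sketch shows; and if owners leave near-default settings like the one Mark found, an attacker needn't even search most of it:

```python
# Enumerate every code a 12-position DIP switch can express.
codes = [format(n, "012b") for n in range(2 ** 12)]

print(len(codes))   # 4096 -- the entire keyspace
print(codes[1])     # '000000000001' -- the password found in the story
```

At even one attempted code per second, the whole space falls in just over an hour. PGN]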
Leslie J. Somos:
> I can understand preventing deployment of spoilers or thrust reversers while
> in the air, but I don't understand preventing brake application.
A lot of the replies have missed a pretty fundamental component of this problem: the increasing design modality of airliner systems. *Landing* modes, *Ground* modes, *Takeoff* modes, *Flight* modes, ad nauseam. Conditional logic is used to disable systems or alter the behavior of control devices to fit the projected use in a specified mode. We have had squat switches for years. They're useful. The problem arises, as I see it, when they provide an online datum for evaluation and use by client devices in a highly abstract *design* context. The brake question is one such example. The brakes weren't enabled because it made no sense to enable them, from the perspective of the cockpit control logic. It's a "tidiness" that makes for clean block diagrams, but in many ways lends a higher level of complexity to a system interface. In a conventional interface, the pilot would be able to massage the brakes to his heart's content: in the air, gear stowed, or whenever. This may not make much LOGICAL sense, though, so the feature's *turned off* in the air... It's yet another manifestation of the conflict between old-fashioned "open-loop" design and "modern" "consider-all-cases" (and hope we got it right!) design.
Larry Seiler:
> I agree that it is better to have an occasional accident due to a safety
> interlock system that fails than to have more accidents due to people
> accidentally doing fatal things like engaging the thrust reversers
I don't see this as an "override" issue. We need to differentiate between items that can cause disasters, and items that don't fit an abstract design model. A failure of thrust-reverser safety interlocks can kill an airplane, as the Lauda crash showed.
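[The distinction Robert draws between "modal" and conventional "open-loop" design can be caricatured in a few lines. This is an illustration of the two design philosophies, not actual brake-control logic from any aircraft:

```python
def brakes_enabled_modal(pilot_commands_brakes, squat_switch_on_ground):
    """'Modal' design: the squat switch gates the brakes entirely.
    If the switch fails to register weight-on-wheels during a landing,
    the pilot's brake command is silently discarded."""
    return pilot_commands_brakes and squat_switch_on_ground

def brakes_enabled_conventional(pilot_commands_brakes):
    """'Open-loop' design: the pilot's command always reaches the brakes,
    in the air, gear stowed, or whenever."""
    return pilot_commands_brakes

# A landing where the squat switch fails to report weight-on-wheels:
print(brakes_enabled_modal(True, False))       # False -- modal logic withholds braking
print(brakes_enabled_conventional(True))       # True  -- conventional control still works
```

The modal version is tidier on a block diagram, but it makes the squat switch a single point of failure for a system that was previously independent of it. PGN]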
"Modality" logic in the case of the brakes makes very, very little sense, however-- it's likely, as Fokker learned, that the modality *decreased* the safety margin, with *no* increase in safety in a properly-functioning system, anyway! Technology for technology's sake, once again. Electronic toilets, anyone? :-) [Big story on the French electronic toilets in New York in this weekend's papers! PGN] Robert Dorsett firstname.lastname@example.org ...cs.utexas.edu!cactus.org!rdd
The fictional story of a book club hassle escalating to a capital case via compounded system errors: "Computers Don't Argue" by Gordon R. Dickson, ANALOG SF&SF Magazine, September 1965, pp. 84-94. Reprinted (as of 1976) in:
  ANALOG 5, 1967, Doubleday, J. W. Campbell, ed.
  ASTOUNDING ANALOG READER, Vol 2, 1973, Doubleday, Harrison and Aldiss, eds.
  NEBULA AWARD STORIES, 1966, Doubleday, D. Knight, ed.
  TRANSFORMATIONS II: UNDERSTANDING AMERICAN HISTORY THROUGH SCIENCE FICTION, 1974, Fawcett Crest Books, D. Roselle, ed.
  WONDERMAKERS 2, 1972, Fantasy Premier Books, R. Hoskins, ed.
It may have been reprinted in DATAMATION, too. (But the above citations are from William G. Contento's INDEX TO SCIENCE FICTION ANTHOLOGIES AND COLLECTIONS, 1976.) It's a great story (though by now more an archetypal contribution to RISKS than science fiction). -- Vernor Vinge, email@example.com
The story of escalating computer mistakes entitled "Computers Don't Argue" by Gordon R. Dickson appears in "Computer Crimes and Capers", edited by Isaac Asimov, Martin H. Greenberg and Charles G. Waugh, ISBN 0-14-007310-8 (British edition, published by Penguin Books); according to the title page, the copyright is held by Conde Nast Publications, 1965. This book is recommended to all RISKS readers for the inclusion of two stories which highlight risks-related issues. The first is "An End of Spinach" by Stan Dryer (which also appeared in the Magazine of Fantasy and Science Fiction and carries a copyright of 1981), and the second is "Sam Hall" by Poul Anderson (copyright 1953 by Conde Nast Publications). Having just flipped through the volume again, I can also recommend "While-U-Wait" by Edward Wellen (copyright 1978, Magazine of Fantasy and Science Fiction). "Computers Don't Argue" also appears in "The Best of Creative Computing Vol Two", edited by David Ahl; that volume is likely now out of print, but it does mention that the story originally appeared in the magazine "Analog". It is interesting to note that many risks mentioned in this forum were considered by SF writers in the fifties and sixties...
> The classic treatment of the "computer-induced" nightmare through
> "minor" errors must be the humorous (fictional) piece done by
> "Datamation" in the early 70's.
Another classic treatment of computer automation gone overboard is the film "Brazil", directed by Terry Gilliam. Makes what we read about here seem like a day at the park :-) Marc