The RISKS Digest
Volume 5 Issue 67

Monday, 30th November 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Aging air traffic computer fails again
Rodney Hoffman, Alan Wexelblat
Computer Virus
Kenneth R. van Wyk via Jeffrey James Bryan Carpenter
Fiber optic tap
Kenneth R. Jongsma
A new and possibly risky use for computer chips
John Saponara
Selling Science [a review]
Peter J. Denning
Risks to computerised traffic control signs
Peter McMahon
Risks in Energy Management Systems
Anon

Aging air traffic computer fails again

Rodney Hoffman <Hoffman.es@Xerox.COM>
28 Nov 87 11:31:06 PST (Saturday)

In RISKS 4.48 (18 Feb. 87), I related how flights throughout Southern California were delayed due to the failure of the “9020” air traffic computer at the L.A. Air Route Traffic Control Center. Since the 9020 failed 12 times during the last six months of 1986, this story violates the masthead guidelines about being nonrepetitious. However, in the Feb. outage, it was reported that the 18-year-old system was “expected to be replaced later this year” [1987].

Following Murphy's Law, not only has the replacement not yet happened, but the system's latest failure was on one of the busiest travel days of the year — the Wednesday before Thanksgiving, when the passenger load was 40% more than on a normal weekday. Additionally, a bomb scare forced an emergency landing of one plane, further fouling flight schedules.

The computer failure, attributed by the L.A. Times to a “software problem” in the “massive IBM computer … that controls high-altitude air traffic for much of California, Arizona, Nevada, and Utah,” lasted 4.5 hours and delayed over 140 flights from Southern California airports by 30 minutes to two hours. The L.A. Air Route Traffic Control Center is one of 20 such FAA regional facilities in the U.S.

Officials stress that the computer failure posed no danger to airline safety. Instead, it forced controllers to shift to a slower backup computer and “to carry printed information by hand, limiting the volume of traffic they can handle.” The news accounts made no mention this time of a date for installation of a replacement computer system. —Rodney Hoffman


Air traffic computer failure?

Alan Wexelblat <wex@MCC.COM>
Fri, 27 Nov 87 12:21:03 CST

COMPUTER BREAKDOWN SLOWS FLIGHTS IN WEST

Los Angeles (AP) - A five-hour air traffic control computer failure Wednesday [the day before Thanksgiving] stalled the holiday weekend getaway for thousands of Californians. The computer broke down at about 5:30 AM and wasn't back in operation until about 10:30 AM, said a spokesman for the Federal Aviation Administration. The computer in Palmdale, 60 miles north of downtown LA, routes air traffic for Southern California and sections of Nevada and Arizona. The cause of the failure was not immediately determined. The failure forced controllers to shift to a backup system that provides less information, which slowed operations, but no safety problems were encountered, the FAA spokesman said.

—Alan Wexelblat UUCP: {harvard, gatech, pyramid, &c.}!sally!im4u!milano!wex


Computer Virus

Jeffrey James Bryan Carpenter <JJC%Vms.Cis.Pittsburgh.Edu@VB.CC.CMU.EDU>
Wed, 25 Nov 87 11:15 EDT

From: IN%"MD4F@CMUCCVMA" "User Services List (ADVISE-L)" 23-NOV-1987 09:33
To: Jeff Carpenter <256521@vms.cis.pittsburgh.edu>
Subj: Virus warning!
Date: Mon, 23 Nov 87 08:05:57 EST
From: "Kenneth R. van Wyk" <@vms.cis.pittsburgh.edu:LUKEN@LEHIIBM1.BITNET>

Last week, some of our student consultants discovered a virus program that's been spreading rapidly throughout Lehigh University. I thought I'd take a few minutes and warn as many of you as possible about this program since it has the chance of spreading much farther than just our University. We have no idea where the virus started, but some users have told me that other universities have recently had similar problems.

The virus: the virus itself is contained in the stack space of COMMAND.COM. When a PC is booted from an infected disk, all a user need do to spread the virus is to access another disk via TYPE, COPY, DIR, etc. If the other disk contains COMMAND.COM, the virus code is copied to the other disk. Then, a counter is incremented on the parent. When this counter reaches a value of 4, any and every disk in the PC is erased thoroughly. The boot tracks are nulled, as are the FAT tables, etc. All Norton's horses couldn't put it back together again… :-) This affects both floppy and hard disks. Meanwhile, the four children that were created go on to tell four friends, and then they tell four friends, and so on, and so on.
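
To get a feel for how fast this spreads, here is a toy simulation of the infect-four-then-wipe rule described above. It models only the disk-to-disk bookkeeping, not any real DOS behaviour, and the population size and contact pattern are invented:

    # Toy model of the spread rule described above: an infected disk
    # copies the virus to a clean disk it touches and is wiped once it
    # has infected four children. Population and contact rates invented.
    import random

    random.seed(1)
    clean = set(range(1, 100))   # 99 clean disks; disk 0 starts infected
    infected = {0: 0}            # disk id -> child counter
    wiped = set()

    for day in range(1, 8):
        for disk in list(infected):
            if not clean:
                break
            victim = random.choice(sorted(clean))   # disk accessed via TYPE/COPY/DIR
            clean.discard(victim)
            infected[victim] = 0
            infected[disk] += 1
            if infected[disk] == 4:                 # counter hits 4: host wiped
                del infected[disk]
                wiped.add(disk)
        print(f"day {day}: {len(infected)} infected, "
              f"{len(wiped)} wiped, {len(clean)} clean")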

Detection: while this virus appears to be very well written, the author did leave behind a couple of footprints. First, the write date of COMMAND.COM changes. Second, if there's a write-protect tab on an uninfected disk, the virus's attempt to copy itself triggers a WRITE PROTECT ERROR… So, boot up from a suspected disk and access a write-protected disk - if the error comes up, then you're sure. Note that the length of COMMAND.COM does not get altered.
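
The write-date footprint lends itself to a simple periodic check. A minimal sketch follows (in present-day Python, obviously not a 1987 tool; the target path and baseline file are assumptions for illustration):

    # Sketch of the footprint check described above: compare COMMAND.COM's
    # current write date against a recorded known-good value. The target
    # path and baseline file are hypothetical.
    import os

    TARGET = "A:/COMMAND.COM"        # hypothetical disk to check
    BASELINE = "command_com.mtime"   # hypothetical stored baseline

    def command_com_unchanged(target=TARGET, baseline=BASELINE):
        current = os.path.getmtime(target)
        if not os.path.exists(baseline):
            with open(baseline, "w") as f:   # first run: record baseline
                f.write(repr(current))
            return True
        with open(baseline) as f:
            recorded = float(f.read())
        # A changed write date (with an unchanged file length) is the
        # footprint the message describes.
        return current == recorded

    if __name__ == "__main__":
        print("COMMAND.COM unchanged:", command_com_unchanged())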

I urge anyone who comes in contact with publicly accessible disks to check their own disks periodically. Also, exercise safe computing - always wear a write-protect tab. :-)

This is not a joke. A large percentage of our public site disks have been gonged by this virus in the last couple of days.

Kenneth R. van Wyk, User Services Senior Consultant, Lehigh University Computing Center (215)-758-4988 <LUKEN@LEHIIBM1.BITNET> <LUKEN@VAX1.CC.LEHIGH.EDU>

Fiber optic tap

<portal!cup.portal!Kenneth_R_Jongsma@Sun.COM>
Sat Nov 28 18:09:48 1987

Up until now, one of the prime advantages of fiber optic cable (aside from its capacity) has been its perceived resistance to being tapped by unauthorized parties. The Nov. 16th issue of EE Times had an interesting article that may change those perceptions.

EE Times reports that Plessey has developed a non-intrusive way of tapping fiber optic cable. The article states that Plessey's design concept has been tested with both high-speed digital and television signals. They don't go into details, but do say that the device clamps over an existing cable and bends it slightly. The small amount of light that escapes from the cable at the bend can be detected and amplified.
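
To see why a "small amount" of leaked light suffices, some back-of-the-envelope arithmetic helps. The article gives no figures, so the launch power, coupling fraction, and receiver sensitivity below are all assumptions:

    # Rough estimate of a bend tap. All numbers are assumed for
    # illustration; the EE Times article gives none.
    import math

    launch_mw = 1.0            # assumed transmitter power: 1 mW (0 dBm)
    coupled_fraction = 0.005   # assume the bend leaks 0.5% of the light

    def dbm(power_mw):
        """Optical power in milliwatts expressed in dBm."""
        return 10 * math.log10(power_mw)

    tapped_mw = launch_mw * coupled_fraction
    loss_db = -10 * math.log10(1 - coupled_fraction)

    print(f"tapped power:   {tapped_mw * 1000:.1f} uW ({dbm(tapped_mw):.1f} dBm)")
    print(f"insertion loss: {loss_db:.3f} dB seen on the legitimate link")

    # Against an assumed receiver sensitivity of -40 dBm, the tap has
    # roughly 17 dB of margin, while the legitimate link loses only a
    # few hundredths of a dB - far too little for the operator to notice.
    print("detectable:", dbm(tapped_mw) > -40.0)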

They acknowledge that this causes problems for system security, but feel that the advantages (primarily in making it cheaper to use fiber as a cable TV medium) outweigh any disadvantages.

I find the implications of this development rather startling. They don't give a price for the device, but given that it is intended to be used as a cable TV line splitter, it can hardly be priced beyond the reach of an individual.


A new and possibly risky use for computer chips

John Saponara
Mon, 30 Nov 87 12:19:28 EST

An interesting use of computer chips was mentioned in "The Christian Science Monitor" in the November 13, 1987 issue, in an article titled "Showdown takes shape over plastic weapons":

Plastic firearms - now being developed for the United States Army by Red Eye Arms Inc. - are a bone of contention between antigun organizations and the National Rifle Association (NRA).

[The article goes on to tell of the opponents, then continues:]

“One of the worst nightmares in our fight against terrorism is the possibility that airline hijackers could carry plastic guns aboard aircraft without detection,” says Senator Metzenbaum. “In order to prevent this frightening scenario, we need to act now.”

But David Conover of the NRA says: “Firearms constructed completely out of plastic don't exist now.” He argues that a ban on this “future” technology does not address the real problem of faulty airport security at the root of terrorist activity.

John Floren, president of Red Eye Arms - which holds the only patent to produce such weapons - says the prototype his firm is developing for the Army would be entirely plastic but would have computer chips implanted to make possible detection of make, model, and serial number from 15 feet away.

[The article goes on to describe the uses of plastic arms and various legislation concerned with them.]

The idea of adding chips to plastic guns, then selling chip detectors to all the airport security checkpoints, seems lucrative but does not strike me as the most sensible approach to the problem. I have not heard of the use of computer chips as a detection scheme before. How foolproof is such a scheme? If I stopped sending current to the chip, would the gun then not fire? Could I reprogram these chips to have false IDs from other guns, or to not transmit? Considering the problems the cellular phone companies have had with reprogrammed phone chips, there seem to be real possibilities of circumventing such measures. Does anyone know of any similar detection systems in use at present, and how secure they are?

Eric Haines


Selling Science [a review]

Peter J. Denning <pjd@riacs.edu>
Mon, 30 Nov 87 12:13:26 pst

Many RISKS readers have struggled with the question of generating a rational public debate when complex technical issues are involved. What follows is a review of an interesting little book chock full of insights into how science journalism works and interacts with the scientific research community. The author's comments on treatment of risks in the media are particularly interesting. I recommend the book highly. (pjd)

Selling Science, Dorothy Nelkin, W. H. Freeman, 1987, 224pp. A review by Peter J. Denning

Have you ever wondered about the apparent contradiction between hyperactive science journalism and extensive scientific illiteracy? Between the promotion of technology as the key to progress in society and the growing fear of technology? Between the demand for sophisticated science-based medicine and the widely supported objections to animal experiments? Between the rationality of science and expectations of “magic bullets” and “miracle cures”? Have you ever wondered whether you will be misrepresented if you talk to a journalist, whether having your research discussed in Newsweek is essential to your continued funding, or whether “popularizers” like Carl Sagan are advancing science? If you have, you are not alone.

These questions touch on fundamental issues in science, technology, and the press. Dorothy Nelkin has faced them head on in this fascinating book. With clarity and painstaking documentation she identifies four main characteristics of science journalism. First, most articles are high in imagery and metaphor, and low in technical content. Many of the images show science as arcane, esoteric, beyond normal understanding, authoritative, trustworthy, pure, neutral, and the ultimate source of rationality and basic truth. High technology is touted as a quick fix to many problems and is the source of much disillusionment when it fails. On debates of great controversy, such as ozone depletion, artificial sweeteners, dioxin, or strategic defense, little technical information is given; instead, equal time is given to all “sides” no matter how irrational they might be from an objective standpoint. Second, much of science is portrayed as a series of dramatic events, rather than the slow, backtracking, plodding process it really is. Third, there is a strong emphasis on competition—e.g., the “race” for breakthroughs, the obsessive 90-hour workweeks of Nobel laureates, or the “technology war” between the United States and Japan. Fourth, scientists have been actively involved in the press; far from being neutral sources, they have sought favorable coverage of their projects. Many institutions have active media programs that have successfully put the most favorable information out for public consumption, professional societies have advanced proposals to control the flow of information to the press, and some journals by policy refuse to publish any finding that has been “scooped” in the public press. In the midst of this, scientists have ambivalent attitudes toward the press, at some times seeking it out, at others criticizing it.

These trends emerge from Nelkin's careful analysis of a large number of scientific articles published over many years, and they correlate well with one's own experience. But the real contribution of the book lies in its careful and highly successful attempt to understand the frames of reference—the mindsets—of scientists and of journalists. Nelkin accurately describes the style of scientific research, the norms of objectivity (especially peer review and reproducibility of results), the professional ideals, the role of technical jargon, and the rules of evidence widely used in science—in short, the unspoken culture in which all scientists operate. She similarly describes the culture of journalism, including basic reporting, editorial constraints, audience assumptions, economic pressures, avoidance of complexity, and vulnerability to sources. From this it becomes easy to appreciate the sources of misunderstandings between scientists and journalists. When a scientist says there is no (statistically significant) evidence of a correlation between power-plant radiation and cancer, a journalist who knows of a few cases of radiation-induced cancer may “hear” a coverup; when a scientist says a new drug produced an improvement in a few AIDS patients, a journalist may “hear” that a cure is imminent. When a journalist asks probing questions about risks of technology, a scientist may “hear” that the journalist is trying to make the evidence fit his own hidden agenda; when a journalist omits important methodological details about an experiment, a scientist may “hear” an attempt to oversell a finding to a gullible public.
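
The radiation example rewards a concrete illustration. In the toy calculation below every number is invented; the point is that a visible handful of excess cases can be entirely consistent with the background rate, which is all the scientist's "no statistically significant evidence" means:

    # Toy significance calculation; all numbers are invented.
    from math import comb

    population = 10_000        # assumed residents near the plant
    background_rate = 0.002    # assumed background incidence (0.2%)
    observed = 25              # assumed observed cases (expected: 20)

    # Exact one-sided binomial test: probability of at least `observed`
    # cases if only the background rate is at work.
    p_value = 1.0 - sum(
        comb(population, k)
        * background_rate**k
        * (1 - background_rate)**(population - k)
        for k in range(observed)
    )
    print(f"p-value: {p_value:.2f}")   # ~0.16, well above the usual 0.05 cutoff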

Nelkin praises efforts to increase mutual cultural understanding between scientists and journalists, such as formal science training for science journalists, the Council for Advancement of Science Writers, and the Media Resource Service of the Scientists Institute for Public Information. Although the tension can be softened, she says, the two cultures are inherently different and the tension cannot be wholly eliminated. Scientists and journalists will have to come to terms with an uneasy, occasionally adversarial relationship.


Risks to computerised traffic control signs

Peter McMahon <munnari!uqcspe.cs.uq.oz.au!pete@uunet.UU.NET>
Mon, 30 Nov 87 13:07:50 est

Quoted from “Computing Australia”, without permission. [23 Nov 87]

“Computer-run signs over Canterbury Road in Melbourne's eastern suburbs suggested a speed of 75 km/h - in a 60 km/h zone - for a clear run through traffic lights.

Despite the anger of police, the mix-up was not resolved for three days.

The 26 electronic signs are part of a new information system devised by the Road Traffic Authority (RTA) […]

[An obviously upset!] Chief Superintendent Frank Green of the Victorian Police traffic department said: ‘If these mongrel machines are telling people to breach the bloody law we'll have to tell the RTA to take its computer and shove it.’

RTA and Mach Systems officials denied any bungled programming and said that only two signs were malfunctioning.”

Peter McMahon, Computer Science, University of Queensland, Australia pete@uqcspe.cs.uq.oz.au


Risks in Energy Management Systems

Anon @ UK academic establishment
26 Nov 87 16:07:33 GMT (Thursday)

A while back someone asked about risks in Energy Management Systems; well, here's one that I know of (even though it isn't related to computer rooms, unlike the original discussion)…

Some time ago I spent about nine months working for a company which produces and sells a computer-controlled distributed energy management system. It consists of outstations (which can work stand-alone) and a central operator's station. The centre was running roughly 10,000 lines of uncommented, undocumented spaghetti BASIC which had evolved over a period of years in the care of a couple of self-taught programmers. The outstations contained a few thousand lines of commented but undocumented assembler. Communication was by reading & writing of memory in the outstation, using explicit memory addresses embedded in the centre software.
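
To make the fragility of that arrangement concrete, here is a hypothetical sketch of such a peek/poke "protocol"; every name and address in it is invented:

    # Hypothetical sketch: the centre reads and writes outstation state
    # by absolute memory address, so any change in the outstation's
    # memory layout silently breaks the centre. All names are invented.
    outstation_ram = bytearray(256)   # stand-in for the outstation's memory

    BOILER_STATUS_ADDR = 0x42         # addresses hard-coded in the centre
    SETPOINT_ADDR = 0x43

    def peek(addr):
        """Read one byte of outstation memory by absolute address."""
        return outstation_ram[addr]

    def poke(addr, value):
        """Write one byte of outstation memory by absolute address."""
        outstation_ram[addr] = value & 0xFF

    poke(SETPOINT_ADDR, 21)           # centre commands a 21 C setpoint
    print("boiler on:", bool(peek(BOILER_STATUS_ADDR)))

    # If a new outstation build moves the setpoint to 0x44, the poke
    # above corrupts whatever now lives at 0x43, with no error reported.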

[My job was supposed to be to help them do a complete redesign & reimplementation, but that was ‘temporarily’ shelved when the Managing Director realised what it would cost (amongst other reasons)]

The standard mechanism for fixing a bug involved someone trying to replicate it, hacking a fix into a copy of the user's version of the program, and then sending it back to them. There was no version control and no propagation of fixes across the user base. Testing involved installation on-site and waiting a day or two to see if things broke. The only in-house tests were done by running new software in the outstation that controlled the company HQ: I've seen clients sitting in our reception area in hats & coats on a Monday morning (waiting to see someone) because the heating hadn't switched itself on (there was a warm-up period, so manual overrides at 8am didn't have much effect until after 10). As a result, in-house tests were strongly discouraged!

Not surprisingly, crashes were common, especially of the ‘centre’. This was embarrassing since the system, originally designed for control of heating plant in schools & factory complexes (e.g. one centre for an education authority, one outstation per school), was also being used as heating control for communal housing projects.

However, the worst incident was in fact hardware related. UK mains voltage is 240V +/- a maximum percentage. The outstations contained a transformer which would drop out (powering down the system) if the voltage fell more than a couple of volts below this (supposedly) absolute minimum, and it would only trip in again at nearly 240V (quite a lot of volts above the minimum). Last winter was quite severe over here, and the strain on the electricity supply system meant it ran near the minimum for extended periods (several hours at a time), with occasional glitches down to the absolute minimum and, we suspect, sometimes slightly lower. When an outstation lost power it halted all of the pumps, boilers, etc. that it controlled; i.e. no heating was provided.
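
That drop-out/trip-in gap is a hysteresis loop, and a small simulation shows its consequence: a single brief dip latches the outstation off for as long as the mains stays merely low. The thresholds and voltage trace below are assumed, since the account gives only rough figures:

    # Sketch of the transformer's hysteresis as described: drop out a
    # couple of volts below the nominal minimum, trip back in only near
    # 240 V. Thresholds and the voltage trace are assumed.
    DROP_OUT_V = 224.0   # assumed: powers down below this
    TRIP_IN_V = 238.0    # assumed: only recovers above this

    def simulate(voltages, powered=True):
        """Yield the outstation's power state for successive mains readings."""
        for v in voltages:
            if powered and v < DROP_OUT_V:
                powered = False
            elif not powered and v >= TRIP_IN_V:
                powered = True
            yield powered

    # One glitch to 223 V latches the system off through hours of
    # legal-but-low mains, until the supply recovers to 238 V or more:
    trace = [235, 228, 223, 227, 230, 232, 234, 236, 239]
    print(list(simulate(trace)))
    # -> [True, True, False, False, False, False, False, False, True]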

Because of the way power was supplied to an outstation in a communal housing project, it was receiving a couple of volts below the actual mains voltage, and one evening it tripped out. It didn't trip back in until many hours later, by which time all of the housing involved had become very cold. The accommodation included some sheltered housing for elderly people, one of whom ended up requiring hospital treatment for hypothermia. Fortunately for the company nobody sued, and the woman involved recovered (remember that hospital treatment over here is free, so there were no medical bills involved).

This sort of problem was far from unique; it's just that the people who are buying these systems often don't realise the potential dangers, and some manufacturers are so busy trying to grab a share of the market that they rush bad &/or untested products to the buyers.
