The RISKS Digest
Volume 7 Issue 39

Wednesday, 24th August 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Computers and Gambling
George Michaelson
Car engines become target for hackers
George Michaelson
Vincennes and Non-Computer Verification
David Collier-Brown
Shades of War Games
Doug Mosher
Emissions testing risk
Levy
Re: British vs. American safety rules
Jon Jacky
Re: Structural analysis programs
Stephen D. Crocker
Re: Danger of Sensitive Car Electronics
Will Martin
Info on RISKS (comp.risks)

Computers and Gambling

George Michaelson <munnari!ditmela.oz.au!G.Michaelson@uunet.UU.NET>
Wed, 24 Aug 88 13:22:23 +1000
  "GAME BANDITS FOIL POLICE" (`Australian' computers section, 23 August 1988)

Victorian & N.S.W. police are noting the increased use of "bent"
one-arm bandits to finance drug & other big-money crime.
The head of the racing and gaming squad reported machines that:

 "..appear to run legitimate amusement games but with the flick 
 of a switch they are converted to gambling machines.

 Machines of greater sophistication are now starting to appear
 with a second switch that totally erases the computer program
 [sic] which runs the illegal games.

 If that happens we are powerless to prosecute."

George Michaelson
CSIRO Division of Information Technology 55 Barry St, Carlton, Vic 3053 Oz


Car engines become target for hackers

George Michaelson <munnari!ditmela.oz.au!G.Michaelson@uunet.UU.NET>
Wed, 24 Aug 88 13:34:54 +1000
Extracts from `The Australian' of 23/8/88:

Computer hackers have found a way to make cars go faster.  They are breaking
into tiny engine management computers and reprogramming them to give family
cars extra acceleration.

One large motor company is about to fit a secret detector "to foil hackers and
safeguard used car buyers from being duped".

Computer tweaking can seriously damage a turbocharged engine, particularly if
it is driven on "full boost" over a long stretch of road; and there are
"growing signs of fraudulent warranty claims", an industry official says.

It costs about $800 to have the engine computer tweaked by specialist
hacking-tuning companies.  There are alternative go-faster computer chips ready
to be installed while you wait.  Customers include owners of every
computer-controlled car in the book.  Computer buffs can do it themselves for
about $20.

Motor companies are now planning to guard themselves against warranty claims
that might arise through unauthorised computer up-rating resulting in engine
failure.  The new electronic detector will record a car's computer instructions
when the vehicle is brought in for repairs under warranty.
False instructions will be detected and the owner told the change has
invalidated the guarantee.
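
The article does not say how the detector works.  One plausible scheme,
sketched below purely as an illustration (the function names, and the use of
a cryptographic hash rather than the simple checksum a 1988-era device would
more likely store, are my assumptions), is to fingerprint the engine
computer's program memory and compare it against the factory image:

  import hashlib

  def rom_digest(rom_image: bytes) -> str:
      """Fingerprint of the engine-management program memory."""
      return hashlib.sha256(rom_image).hexdigest()

  def check_warranty_claim(rom_image: bytes, factory_digest: str) -> str:
      """Compare the ECU contents read at the dealership against the
      digest the factory recorded when the car was built."""
      if rom_digest(rom_image) == factory_digest:
          return "ROM matches factory image: claim may proceed"
      return "ROM has been reprogrammed: guarantee invalidated"

  # The factory records the digest at build time...
  factory_rom = bytes(range(256)) * 64          # stand-in for a real ROM dump
  factory_digest = rom_digest(factory_rom)

  # ...and a one-byte "go-faster" tweak is later caught at the dealership.
  tweaked_rom = bytearray(factory_rom)
  tweaked_rom[42] ^= 0x01
  print(check_warranty_claim(bytes(tweaked_rom), factory_digest))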

Insurers say the unauthorised modifications will invalidate the insurance
policy.

George Michaelson, CSIRO Division of Information Technology

     [Early issues of RISKS have addressed this problem in the USA.  For
     example, RISKS-4.12 noted the reprogramming of a 1984 Firebird.  PGN]


Vincennes and Non-Computer Verification [CFeehrer RISKS-7.38 (and 33)]

David Collier-Brown <daveb@geac>
Wed, 24 Aug 88 10:12:23 EDT
  One point that I did not see in the discussion to date (which CFeehrer
inadvertently reminded me of) was the peculiar position of the Captain of the
Vincennes:  He had no independent means of verifying the correctness of the
information his battle-management system was using.
  He was faced with making a decision on doubtful information,
information which did not become "better" over time, and he had to
make it before a certain time, after which it would be too late to
defend himself against an aircraft and he would face the greater risk
of defending himself against a missile.
  Not a good situation to be in.

  Yet it is a perfectly "ordinary" consideration in war to try to
verify the poor, incomplete and fragmented information which a
commander has to deal with.  If your air force says an enemy column
is near, you send out a scouting force to verify that this really is
the enemy you're worried about (and to slow it down if it is).  If
your radar says a Bear bomber is approaching the coast of Labrador,
you send out aircraft to have a look, and hopefully send the bear on
its way.
  In the case of the Vincennes, where were the air patrols which
should have accompanied any major combat vessel?  Where were the
captain's eyes and first line of defense against air attack?
  Even in peacekeeping operations, one tries to have sufficient
support arms nearby and available on a moment's notice (SOS tasks if
they're artillery, combat air patrols if they're air force, etc.)

  Indeed, **what happened** in the case of the Vincennes?  Was the
U.S. operating naval patrols in a war zone without air support?  If
so, why?

  What kind of faith are we placing in electronics if we send major
vessels out "alone" into war zones, even on non-warlike missions,
without providing them with mobile forces to identify and hopefully
deal with air attacks, and nearby support forces for backup?

David Collier-Brown, 78 Hillcrest Ave, Willowdale, Ontario
{yunexus,utgpu}!geac!lethe!dave


Shades of War Games (Source passwords)

<SPGDCM%UCBCMSA.Berkeley.EDU@jade.Berkeley.EDU>
Tue, 23 Aug 88 16:45:27 PDT
I just received a mailed promotional piece from The Source, a well-known info
network. It included an application form.

To my astonishment, it included a blank for "Mother's Maiden Name (for
verification of ID or Password if lost):_______________________________".

Good Heavens! This implies that a publicly obtainable, permanent fact about
me could be used to obtain my password, or equivalently, that this factoid IS
in effect my password regardless of what I select. Need I say more?

      [Surprised?  PGN]
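
To make the equivalence explicit, here is a minimal sketch (the account data
and function names are invented for illustration): if a lost password can be
recovered by supplying the mother's maiden name, an attacker who knows that
publicly researchable fact never needs the password at all.

  accounts = {
      "doug": {"password": "s3cret-chosen-with-care",
               "maiden_name": "smith"},   # on file "for verification"
  }

  def login(user, password):
      return accounts[user]["password"] == password

  def recover_password(user, maiden_name):
      # The Source's form implies a recovery path something like this.
      if accounts[user]["maiden_name"] == maiden_name:
          return accounts[user]["password"]
      return None

  # The attacker bypasses the carefully chosen password entirely:
  stolen = recover_password("doug", "smith")   # found in public records
  assert stolen is not None and login("doug", stolen)
  print("account compromised without ever guessing the password")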


Emissions testing risk

<att!ttrdc!levy@ucbvax.Berkeley.EDU>
Tue, 23 Aug 88 00:16:20 PDT
In the Chicago area, for the last two years, there has been a program of auto
emissions testing, administered by the IEPA (Illinois Environmental Protection
Agency).  Auto, truck, and van owners here are required by law, upon mailed
notice, to bring their vehicles annually to special testing stations.  At a
testing station, an attendant puts a sensor probe in a vehicle's tailpipe; the
driver idles the engine at low and high speeds, and the machinery checks the
pollutant emissions.  When the test is finished, the attendant keys in the
car's vehicle identification number (read from a pre-test windshield sticker
that is peeled off) and the results are recorded in an IEPA computer database.
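
A mis-keyed VIN, which figures in the story quoted below, is exactly the kind
of error a check digit is designed to catch; 17-character VINs (required on
US vehicles since 1981) carry one in position 9.  A sketch of that standard
validation, which data-entry software at the stations could apply on the spot:

  # Check-digit validation for 17-character VINs (the check digit is
  # position 9).  Entry software applying this test would reject most
  # single-keystroke errors immediately instead of recording them.
  TRANSLIT = {**{str(d): d for d in range(10)},
              "A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": 7, "H": 8,
              "J": 1, "K": 2, "L": 3, "M": 4, "N": 5, "P": 7, "R": 9,
              "S": 2, "T": 3, "U": 4, "V": 5, "W": 6, "X": 7, "Y": 8, "Z": 9}
  WEIGHTS = (8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2)

  def vin_is_valid(vin: str) -> bool:
      if len(vin) != 17 or any(c not in TRANSLIT for c in vin):
          return False            # 'I', 'O', 'Q' never appear in a VIN
      total = sum(TRANSLIT[c] * w for c, w in zip(vin, WEIGHTS))
      check = "X" if total % 11 == 10 else str(total % 11)
      return vin[8] == check

  print(vin_is_valid("1M8GDM9AXKP042788"))   # True:  a commonly cited valid VIN
  print(vin_is_valid("1M8GDM9AXKP042780"))   # False: one keystroke off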

If a notified vehicle owner neglects to have the vehicle tested (or fails three
tests in a month and does not qualify for a waiver by having certain required
service done on the vehicle) the owner will be sent a series of followup
notices over several weeks.  If these are not heeded, the vehicle owner's
driver's license is suspended and this fact is recorded in a police database.
The owner becomes liable to arrest should he or she drive while the license is
suspended and be ticketed or stopped for even a minor traffic offense.

Personally, I believe in emissions control and emissions testing (how could I
say otherwise after the stand I recently took in rec.autos? :-) but, of course,
the computerization of the system is liable to the usual snafus we all know and
love to hate.  As printed in the Sunday, August 21 Chicago Tribune headline
article, "100,000 drivers risk jail under clean-air law" (almost all of these
are because of genuine failure to obey the law, rather than computer
foulups, but this is still pertinent to RISKS):

  ... Joel Aaseby of Elmhurst ... was issued a traffic ticket following a
  minor accident in Elmhurst.  "They ran my name through the police computer,
  and there was the notice of my suspended license," Aaseby said.  "... I had
  to go down to the station under arrest and fill out a bond card."

  What made Aaseby's experience upsetting for him was that he had passed the
  auto emissions test, but in the process one of the testing clerks punched the
  wrong vehicle identification number into the computer.  Aaseby got several
  more notices that he had failed the tests and that his license was about to
  be suspended.  "Numerous times I mailed them [the secretary of state's
  office] all the documentation that I had passed, but never heard back until I
  was arrested," he said.

  "Now I have a court date, and I will take my documentation to court and I'm
  confident the judge will understand, but do you have any idea how much
  aggravation I have gone through, especially since the newspaper listings of
  arrests has my name on the top of the list this week for driving with a
  suspended license?" he said.

  "We have heard these kinds of stories," [Cinda Schien, Illinois EPA
  spokeswoman] said.  "But hopefully we have got [sic] rid of these problems.
  We think we are fortunate we haven't had more.  Remember, we are dealing
  with 2.5 million cars and numerous mailings for each one.  We basically had
  to invent the wheel for the first couple of years," she said.

What bothers me the most about this is not the foul-ups themselves (there are
almost bound to be problems in any massive endeavor of this sort) but rather
the apparent refusal of the people administering this system (and the
associated law enforcement system) to take full responsibility for foulups when
they do happen, and their apparent propensity to believe they're not really
happening.  Why else, for example, would Schien have spoken of "hav[ing] heard
these kinds of stories" instead of something like "being aware of occurrences
like this"?  Do I smell some bureaucratic CYA here?


Re: British vs. American safety rules

Jon Jacky <jon@june.cs.washington.edu>
Tue, 23 Aug 88 10:02:20 PDT
<> (I reported) (British Ministry of Defense) regulations explicitly
<> prohibit any cost saving that might increase hazard to life — you
<> are not allowed to trade lives off against money
>
> (Henry Spencer replied) Britons should not consider this reassuring;
> it is, in fact, alarming.  This rule was obviously written by a politician
> ... not by an experienced engineer

Cullyer said that he himself had a hand in drafting some of these MOD regs,
although he didn't go into detail about who wrote what at the level of chapter
and verse.  Whoever was responsible, Cullyer was clearly approving of this
particular requirement.  He is certainly an "experienced engineer" - and
evidently a good one; several people at the conference (including our
moderator, Peter Neumann) said the VIPER work appeared to be the greatest
practical achievement yet made in the formal verification field.

> (Henry asks) is this rule really serving the intended purpose?

I should emphasize that Cullyer was not reading verbatim from the rulebook -
this was his verbal paraphrase of the intent of the regulations.  The point
Henry makes - that you can ALWAYS propose some change that appears to increase
the safety but also increases the cost - does require a response.  I can only
say that Cullyer and his group appeared sharp enough so that I assume this was
dealt with in some reasonable way.  I gathered from the context of what he was
saying that the intent of the rules was that shortcuts should not be taken 
to meet schedules or budgets if these resulted in violations of accepted 
standards and practices - and that these standards should be kept high.

> What this rule is doing is guaranteeing that the risks and tradeoffs
> will not be discussed openly!

In fact Cullyer, who is an employee of the British MOD, gave a franker 
description at a public meeting of software problems within his organization 
than I have ever heard in public from any employee of the United States 
Department of Defense (or any other public or private organization, for that
matter).  I am not aware of any United States studies like 
the NATO software study, in which samples of software from the field were 
systematically examined for latent faults, prior to any accidents - and 
then the results were reported at a meeting in a foreign country.  I 
emphasize that Cullyer's presentation did not have the connotations of
aggrieved victimization usually associated with "whistleblowing" in this
country - rather the RSRE group and the MOD administration were engaged in
a constructive effort to be as thorough as possible.

> (Henry suggests we) consider what is likely to happen.  ...  Manufacturers
> will privately evaluate the tradeoffs - the obvious thing to do is to start
> out
> with the cheapest and *most hazardous* version.

I understood Cullyer's larger point about the difference between the two
countries to be that the climate in Britain is such that the pressures
encouraging this kind of adversarial legalistic maneuvering are somewhat less
than in the US.  I would add my personal observation that if Cullyer is any
example, British civil servants are afforded higher status and respect than
many of their counterparts in the United States; manufacturers therefore
realize that the reviewers who evaluate their products may be at least as smart
as they are, and will recognize ploys such as the one Henry describes.

I think Cullyer's point regarding the regulations was that it is possible to do
a better job at designing safe computer-controlled equipment than is now common
practice.  It need not even be all that much more expensive, but people do not
do it because they do not know how, or are unwilling to expend any additional
effort.  He noted that it took a certain amount of goading from people in
authority to make the point that safety should be an important design goal, and
that a good way to do that was to announce that products found wanting will be
rejected.

I also understood Cullyer to mean that when designing things like fly-by-wire
aircraft, nuclear weapons fuses, and nuclear power plants, it was necessary
that a degree of goodwill and cooperation exist among all parties, and that
excessive attention to cost-cutting would just get everyone into big trouble;
if the budgetary pressures are all that pressing, the project should perhaps
not be undertaken.

> It is *always* necessary to trade lives off against money, because there
> is no limit to the money that can be spent adding 9's to the end of
> 99.9999% reliability

Interestingly, another of Cullyer's points was that statistical reliability
estimates with lots of 9's in them were not meaningful, and he did take
personal credit for writing the MOD regulations that say that such claims will
not be regarded seriously! Actually several speakers at COMPASS made this
point.  It is discussed at some length in the report that will appear in the
October 88 ACM SOFTWARE ENGINEERING NOTES, and I will post an excerpt to RISKS.
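
For perspective, here is my own back-of-the-envelope illustration (not from
Cullyer's talk) of why lots of 9's cannot be demonstrated: the classical
zero-failure bound says that to claim failure probability below p with
confidence C you need roughly ln(1-C)/ln(1-p) failure-free trials, so every
extra 9 multiplies the required testing tenfold.

  import math

  def trials_needed(p, confidence=0.99):
      """Failure-free trials needed to claim failure probability < p
      at the given confidence (classical zero-failure test bound)."""
      return math.ceil(math.log(1 - confidence) / math.log(1 - p))

  for nines in range(3, 8):
      p = 10.0 ** -nines
      reliability = "99." + "9" * (nines - 2) + "%"
      print(f"{reliability:>11}  ->  {trials_needed(p):>12,} failure-free trials")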

- Jonathan Jacky, University of Washington


Re: Structural analysis programs (RISKS-7.38)

Stephen D. Crocker <crocker@tis-w.arpa>
Mon, 22 Aug 88 17:57:51 PDT
Alan Kaminsky suggests detailed structural analysis programs, e.g. NASTRAN,
are making it possible for engineers to justify smaller safety margins, and
he asks in various ways if this represents a RISK.

This reminded me that current practice for design and implementation of
software has a complementary RISK.  It is quite common for computer
system designers to select hardware that has more memory and speed than
they think they're going to need.  "Safety factors" of two to four are
often considered reasonable if they don't increase the cost too much.
However, this safety margin usually has far less basis than a NASTRAN
analysis, and the results are often expensive.  Sometimes the slack is
consumed during implementation when it's discovered that the program just
doesn't run as fast as it was expected to, and sometimes the slack is consumed
during operation.  It is not uncommon for simple "data overrun" conditions to
lead to software crashes because the programmer never expected the input to
exceed the computer's speed.
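
To make the "data overrun" scenario concrete, here is a toy simulation (all
rates invented) of input arriving faster than the service rate the designer
assumed; the only real design choice is whether the overrun is counted and
shed, or silently accumulates until something breaks.

  from collections import deque

  ARRIVALS_PER_TICK = 5     # what the field actually delivers
  SERVICE_PER_TICK = 4      # what the designer sized the system for
  BUFFER_SLOTS = 100        # the memory "safety factor"

  buffer, dropped = deque(), 0
  for tick in range(1000):
      for _ in range(ARRIVALS_PER_TICK):
          if len(buffer) < BUFFER_SLOTS:
              buffer.append(tick)
          else:
              dropped += 1  # an honest design counts and sheds overruns;
                            # a naive one overflows and crashes instead
      for _ in range(min(SERVICE_PER_TICK, len(buffer))):
          buffer.popleft()

  print(f"dropped {dropped} of {1000 * ARRIVALS_PER_TICK} inputs")
  # No fixed buffer helps when arrivals outpace service: the backlog
  # grows by one item per tick until the slack is gone.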

The state of the art would be in far better shape if we had tools as
useful as NASTRAN for sizing computer systems.


Re: Danger of Sensitive Car Electronics

Will Martin — AMXAL-RI <wmartin@ALMSA-1.ARPA>
Mon, 22 Aug 88 16:13:59 CDT
Interesting that this topic should be brought up just now. I just received
in the mail a catalog on "Electrical Noise and Interference Control Courses,
Publications, & Services" from Interference Control Technologies (Star Route
625, PO Box D, Gainesville, VA, 22065 (703)347-0030), which has on its cover
a rather strange painting of a car going through a guardrail and over a cliff
as the result of a radio transmitter radiating above a mountain tunnel
opening. I say "strange" because the angle of the car and the tire tracks
look unnatural, but then I'm no traffic-accident specialist.

Anyway, those interested in this topic may want to write for this catalog
(it is the "September-February" issue) — the company holds seminars and
puts out books and symposium proceedings on the topic of EMC and EMI.

There's also a free controlled-circulation magazine on the subject; write
to EMC Technology, Circulation Dept, PO Box D, Gainesville, VA 22065.
Note the similarity to the above address - I guess they are all part of
the same organization.
