Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
Volume 14: Issue 2
Monday 9 November 1992
Contents
Voting Machine Horror Story - Al Stangenberger
Phone voting in NM - Gary McClelland
Salvage Association vs CAP Financial Services - Les Hatton
Computer system blamed for lack of official trade figures - John Jones
Privacy Digests - PGN
Another TV show showing computer `hackers' - Matthew D. Goldman
Re: Encryption Keys - Glenn Story
Steven Tepper
D. Longley
Re: Risks Of Cellular Speech - Johnathan Vail
London Ambulance Service computer fails again - Tony Lezard
Paul Johnson
Re: Cash dispenser fraud - Thor Lancelot Simon
Antoon Pardon
Info on RISKS (comp.risks)
Voting Machine Horror Story
Al Stangenberger <forags@insect.berkeley.edu>
Fri, 6 Nov 92 09:20:45 PST
From the Daily Californian, 6 November 1992:

A misaligned Votomatic machine may have caused hundreds of Berkeley voters to
cast their ballots for candidates and propositions they never intended to
support, according to Alameda County Registrar of Voters Emmie Hill. Votomatic
machines align a pre-scored data card with numbers on a printed picture of the
ballot. Voters make their choices by punching a stylus through designated
holes, thereby punching the data card. In this case, the holes were still
within the printed squares on the ballot, but were actually causing holes to
be punched in other locations on the card. An alert voter reviewed her
completed ballot and noticed the discrepancy, but precinct workers did nothing
until late afternoon, when a troubleshooter from the Registrar's office,
acting on the voter's complaint, checked the unit and confirmed the problem.

I had never thought of this type of error being possible - the ballots look so
neat when they come out of the machine. Should we add a second definition to
GIGO, namely Goodness In, Garbage Out, to cover such cases?

Al Stangenberger, Dept. of Forestry & Resource Mgt., 145 Mulford Hall,
Univ. of Calif., Berkeley, CA 94720
Phone voting in NM
Gary McClelland <mcclella@yertle.Colorado.EDU>
Mon, 9 Nov 1992 08:17:04 -0700
A group of Boulder residents has been actively campaigning for phone voting.
As a consequence, the Boulder media has followed phone voting efforts
elsewhere. Below is a summary and excerpts from an article by Carol Chorey in
the Boulder Daily Camera on 9 Nov 92 about phone voting developments in New
Mexico.
"New Mexico was spurred to develop a phone voting system by `lots of voter
frustration in getting to the polls,' said Russell Barnes, director for
information systems with the New Mexico secretary of state's office. Members
of his office started talking to the security experts at Sandia [National Labs
in Albuquerque] last March because `we thought we had the expertise with Sandia
Labs' so close at hand." Scientists at Sandia took on the work as a commercial
project rather than a federal one. "The labs...are converting from federal to
commercial work because they have lost much of their funding."
The goal is to have phone voting as an option to voting at the polls in
the same way that voting absentee by mail is currently an option. Thus, one
would have to apply to vote by phone and the presumption is that not everyone
would do so. Just like absentee voters, phone voters would have two to three
weeks to call in their votes before election day so phone lines won't be
overloaded.
Security? "`Voting by phone is not really the issue--it's a simple
technology,' Barnes said. `The problem is people feel it is very unsafe
because it's so easy to tap into the phone system.'" "With the help of
computer security experts at Sandia..., the secretary of state's office is
developing a system officials say will be uncrackable." :-) As a test, the
prototype system
was used in a mock election involving 2,300 high school students and their
parents on a facsimile of this year's NM ballot. During the week-long mock
election, "Sandia `black-hatters'--experts at breaking into telephone
systems--were unable to crack the identification system and could not create
duplicate votes." The article includes this cryptic comment: "An encoding
system still needs to be developed to make sure the system will be safe from
mass fraud."
New Mexico is spending about $2 million on the system.
They hope to recover these costs by selling the system to other
states.
gary mcclelland, univ of colorado, mcclella@yertle.colorado.edu
Salvage Association vs CAP Financial Services
Les Hatton, Programming Research Ltd., U.K. <lesh@prl0.uucp>
Mon Nov 9 17:57:00 1992
The legal side catches up on the (lack of) quality ...

CAP loses legal battle: Computer Weekly, Thursday, 29 October 1992

The Salvage Association has won its year-long battle with CAP Financial
Services over a botched computerised accounting system. The Salvage
Association, which processes marine insurance claims, commissioned CAP in 1988
to develop a bespoke accounts system based on software from relational
database supplier Oracle. But the finished system was riddled with over 600
errors and had to be scrapped in 1989 after repeated attempts to modify it
failed to produce a usable version. Last week Judge Thayne Forbes ordered CAP,
now part of Anglo-French services group Sema, to pay the Salvage Association
662,926 UK pounds in damages. Further damages will be awarded next month, when
interest charges and legal costs are taken into consideration ...

I wouldn't be surprised here if the damages exceeded the development costs!
Really bad code probably has an error about every 50-100 lines, suggesting
that the package might be around 30,000-60,000 lines. At around 5000 lines per
programmer-year for this kind of work, and say 50,000 pounds per year for a
programmer all in, the development costs would be around 360,000-600,000
pounds. I guess that this kind of thing will become much more frequent.

Dr Les Hatton, Director of Research, Programming Research Ltd, England
lesh@prl0.co.uk (44) 372-462130
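Hatton's back-of-envelope size and cost estimate can be reproduced in a few
lines. The figures below are his stated assumptions (not data from the court
case), and the straight arithmetic lands at roughly 300,000-600,000 pounds,
close to his quoted range and comfortably below the damages award:

```python
# Reconstruction of the estimate above; all inputs are Hatton's assumptions.
errors = 600                          # errors reported in the delivered system
lines_low = errors * 50               # "an error about every 50 lines"
lines_high = errors * 100             # ... or about every 100 lines
lines_per_programmer_year = 5000      # assumed productivity
pounds_per_programmer_year = 50_000   # assumed fully loaded cost

cost_low = lines_low / lines_per_programmer_year * pounds_per_programmer_year
cost_high = lines_high / lines_per_programmer_year * pounds_per_programmer_year

print(f"estimated size: {lines_low:,}-{lines_high:,} lines")
print(f"estimated development cost: {cost_low:,.0f}-{cost_high:,.0f} pounds")
```

On these numbers the 662,926-pound damages do indeed exceed the plausible
development cost, as the post suggests.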
Computer system blamed for lack of official trade figures
John Jones <jgj@cs.hull.ac.uk>
Thu, 5 Nov 92 13:50:15 GMT
An article that appeared in the Guardian (7th October, 1992) suggests that the
British Government is unable to monitor trade sanctions imposed on Serbia
because there are no official trade figures:
Computer flaw lets Serb sanction-busting slip
David Hencke, Westminster Correspondent, The Guardian, 7th October, 1992
Details of Serbian sanctions busting are being wiped out every month by
the erase button of a Whitehall computer, undermining Britain's pledge
to the United Nations to root out and record the illicit trade.
Embarrassed Customs and Excise officials admitted to the Guardian
yesterday that they keep only one month's illegal trade figures for
Serbia and Montenegro because they have no back-up system to hold the
information for longer than 28 days.
``We weren't prepared for Yugoslavia to break up. We did not have a
programme for separate republics,'' an official explained.
``We don't keep the information because our principal client, the
Central Statistical Office, which records overseas trade figures,
intends to keep Yugoslavia as a single country until next year.'' [...]
I do not think it was clear in the article exactly what was going on, but my
interpretation is that the people who normally receive trade figures, the
Central Statistical Office, do not want them. The people who generate the
figures, Customs and Excise, are unable to save them because of an inadequate
computer system. This is convenient, because the article went on to state that
what trade figures are known show that trade has increased dramatically in a
number of areas, including the export of telecommunications equipment and
petrol (no specific figures quoted, so we could still be talking small
numbers).
This article caught my attention for several reasons:
- the standard of reporting (lack of firm detail, choice of language)
- people hiding behind inadequacy of computer systems (as usual)
- political capital out of the inadequacy of software
John Jones, Department of Computer Science, University of Hull, UK
Privacy Digests
"Peter G. Neumann" <neumann@csl.sri.com>
Mon, 9 Nov 92 17:30:03 PDT
Periodically I will remind you of TWO useful digests related to privacy,
both of which are siphoning off some of the material that would otherwise
appear in RISKS, but which should be read by those of you vitally interested in
privacy problems. RISKS will continue to carry higher-level discussions in
which risks to privacy are a concern.
* The PRIVACY Forum Digest (PFD) is run by Lauren Weinstein. He manages it as
a rather selectively moderated digest, somewhat akin to RISKS; it spans the
full range of both technological and non-technological privacy-related issues
(with an emphasis on the former). For information regarding the PRIVACY
Forum, please send the exact line:
information privacy
as the BODY of a message to "privacy-request@cv.vortex.com"; you will receive
a response from an automated listserv system.
* The Computer PRIVACY Digest (CPD) (formerly the Telecom Privacy digest) is
run by Dennis G. Rears. It is gatewayed to the USENET newsgroup
comp.society.privacy. It is a relatively open (i.e., less tightly moderated)
forum, and was established to provide a forum for discussion on the
effect of technology on privacy. All too often technology is way ahead of
the law and society as it presents us with new devices and applications.
Technology can enhance and detract from privacy. Submissions should go to
comp-privacy@pica.army.mil and administrative requests to
comp-privacy-request@pica.army.mil.
There is clearly much potential for overlap between the two digests, although
contributions tend not to appear in both places. If you are very short of time
and can scan only one, you might want to try the former. If you are interested
in ongoing detailed discussions, try the latter. Otherwise, it may well be
appropriate for you to read both, depending on the strength of your interests
and time available.
PGN
Another TV show showing computer `hackers'
Matthew D. Goldman <goldman@orac.cray.com>
Fri, 6 Nov 92 09:08:13 CST
You might want to tune in to next week's Beverly Hills 90210 to catch the
hacker subplot. Last week on 90210, one of the main characters obtained a
password to the high school's computer grade system via social engineering.
Later, he and a freshman computer dude (this is California, after all) broke
into the school and attempted to adjust school records. The password "JESTER"
seemed to work at first; however, the system locked up and shut down after a
few minutes. The two kids fled. It will be interesting to see if there is an
audit trail...

Matt Goldman goldman@orac.cray.com
Re: Encryption Keys (RISKS-13.85)
<STORY_GLENN@tandem.com>
9 Nov 92 09:44:00 -0800
In reference to Dorothy Denning's five-step scheme for the government to hold
encryption keys: The Justice Department (and FBI) could dispense with steps
1-3 by the simple expedient of disregarding due process. (It wouldn't be the
first time.) Step 4 can be eliminated with standard, well-known intercept
techniques. This leaves only step 5: "Listen in and decrypt the
communications." Doesn't sound too tight to me.

Regards, Glenn

"When cryptography is outlawed, only outlaws will use cryptography."
Can encryption be defined precisely?
Steven Tepper <greep@speech.sri.com>
Wed, 4 Nov 92 13:47:01 PST
> Dorothy Denning suggested that anyone using high-level encryption over a
> public network be required to register their encryption keys with some
> agency.

(I assume this is supposed to mean decryption keys, not encryption keys.)

The discussion of requiring the registration of decryption keys raises the
question of whether "encryption" can be defined precisely enough to make this
proposal workable. Can translating a message into a different language be
called "encryption"? We would not normally use the word in this way. But if a
bad guy wants to send a message to a crony without the government being able
to understand it, he can send it in some extremely obscure language (suitably
transliterated into a standard character code if necessary) that they happen
to know (or take the trouble to learn a little of). Will the government decide
to outlaw all communications in foreign languages? Or publish a list of
approved languages?

Take this a step further. What's to stop the bad guys from creating their own
language? (Say, something like Esperanto, but based on Navajo instead of
Italian.) Let's say that the language is given a greatly simplified syntax, to
make it easy to learn and remember. Can translating messages into this
language be defined as encryption? If so, what is the appropriate decryption
key? If the language changes fast enough, translation into it resembles the
use of a one-time pad. Will the law attempt to distinguish between "real" and
"artificial" languages and allow only the former? If so, would real languages
that are now extinct be allowed? Would a language be allowed only if a written
dictionary and description of its grammar exist in some more widely known
language?

To summarize, is it possible to define a precise distinction between
encrypting a message and translating it into a different language? Is it
possible to outlaw the former while permitting the latter? Remember that there
will always be people pushing the limits of the law.
Steven Tepper <greep@speech.sri.com> SRI International Menlo Park, California
FBI Registration
CA29F200CE9F204D17@qut.edu.au ?? <D.LONGLEY@qut.edu.au>
Tue, 10 Nov 92 09:07 +1000
Dennis Longley, Information Security Research Centre, Queensland University of
Technology.

HOW TEFLON JOHN COPED WITH KEY REGISTRATION

1. Teflon John registers DES key K1 with the FBI and announces that he is
using 4-bit cipher feedback, with the IV sent as a preamble to each message.

2. Teflon John gives MAC the KNIFE a second DES key K2 and a software package.

3. Teflon John wants to tell MAC the KNIFE to make an offer that LOUIE can't
refuse, but he doesn't want to spoil the surprise for LOUIE or the FBI.
Moreover, he does not want to involve MAC the KNIFE, who can read easy words
but can't write real good, in complex codebook systems. The scheme must be
virtually transparent to both sender and receiver.

4. Teflon John produces the plaintext "Give Louie a nice present at 6 pm on
Monday" and encrypts it with K2, giving ciphertext C1.

5. Teflon John's problem is that although the FBI can't decipher C1, they will
get suspicious if they tap a message that cannot be decrypted with K1. He has
to find a mechanism for transmitting C1 so that it appears to be an innocent
message when decrypted with K1, but the true message when decrypted with K2.

6. Teflon John breaks C1 up into 4-bit nibbles and produces a host, about
30-40, of innocent messages I1 - In.

7. The messages I1 to In are encrypted with 4-bit cipher feedback using K1.
The first 4 bits of each ciphertext are scanned until one is found with the
same first 4 bits as C1. Since there is a 1:16 probability that two 4-bit
values are identical, with 32 messages there is an 87% probability that a
match can be found. Just keep generating messages if necessary until a
matching message Im is found. The encrypted message eK1(Im) is transmitted.
The FBI decrypts a harmless message; MAC the KNIFE's software package retains
the first 4 bits of the ciphertext eK1(Im).

8. Next take the second nibble of C1 and search through the ciphertexts
eK1(Ip) for the second innocent message to be transmitted, i.e., the one whose
second nibble is identical to the second nibble of C1. MAC the KNIFE's
software package retains the second nibble of this message.

9. Proceed in this manner until MAC the KNIFE has all of C1, which is then
decrypted with K2, and goodbye Louie.

10. The same innocent messages can be used over and over again; they will
simply be sent in a different order. If the FBI is getting suspicious, then
variants of the messages can be made: any variation after the nibble used will
not affect the process. I guess that the FBI cryptanalysts could get
suspicious about the bursts of similar innocent messages preceding expensive
funerals, but their prosecuting attorneys would not have an easy job,
particularly if Teflon John had a good key management scheme for K2. The only
prosecution evidence would be a set of ciphertexts that they claimed
represented evil intent, but they could not produce the corresponding
plaintext messages.

11. Everybody is happy with the system: Teflon John is getting his messages
through securely, the FBI assures itself that it is doing a good job keeping
Teflon John on the straight and narrow, and AT$T is getting a much larger
telephone income from Teflon John. Louie is not so keen, but nobody liked him
anyway.

12. The 4-bit cipher feedback allows Teflon John to reduce his phone bill with
more sophisticated approaches in which more than one nibble is used per
message. With 4-bit cipher feedback there is more control over the ciphertext,
so the messages can be modified easily to produce the requisite nibbles
corresponding to the ciphertext used by MAC the KNIFE.
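The nibble-matching scheme can be sketched in a few lines of Python. As an
assumption for illustration, DES in 4-bit CFB is replaced by a toy XOR stream
cipher keyed by (key, IV); the subliminal channel only requires that the first
ciphertext nibble vary with the per-message IV preamble, which it does here:

```python
import hashlib
import os

def keystream(key: bytes, iv: bytes, n: int) -> bytes:
    """Hash-based keystream: a toy stand-in for DES 4-bit CFB."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out.extend(hashlib.sha256(key + iv + counter.to_bytes(4, "big")).digest())
        counter += 1
    return bytes(out[:n])

def toy_encrypt(key: bytes, iv: bytes, msg: bytes) -> bytes:
    """XOR stream cipher: encryption and decryption are the same operation."""
    return bytes(m ^ s for m, s in zip(msg, keystream(key, iv, len(msg))))

def nibbles(data: bytes):
    for b in data:
        yield b >> 4
        yield b & 0x0F

def send_covert(c1: bytes, k1: bytes, innocents: list) -> list:
    """Teflon John's side: for each nibble of C1, re-encrypt innocent
    messages under K1 with fresh IVs until the first ciphertext nibble
    matches, then transmit that (iv, ciphertext) pair."""
    out, i = [], 0
    for nib in nibbles(c1):
        while True:
            iv = os.urandom(8)               # the announced IV preamble
            ct = toy_encrypt(k1, iv, innocents[i % len(innocents)])
            i += 1
            if ct[0] >> 4 == nib:            # 1-in-16 chance per attempt
                out.append((iv, ct))
                break
    return out

def receive_covert(transmitted: list) -> bytes:
    """MAC the KNIFE's package: keep only the first nibble of each
    ciphertext and reassemble C1 from nibble pairs."""
    nibs = [ct[0] >> 4 for _, ct in transmitted]
    return bytes((nibs[i] << 4) | nibs[i + 1] for i in range(0, len(nibs), 2))

k1, k2 = b"registered-with-FBI", b"private-second-key"
iv2 = b"\x00" * 8                            # fixed IV for the K2 layer
secret = b"Give Louie a nice present at 6 pm on Monday"
c1 = toy_encrypt(k2, iv2, secret)

innocents = [b"See you at church on Sunday.", b"Mama sends her regards.",
             b"The weather here is lovely.", b"Dinner at eight as usual."]
msgs = send_covert(c1, k1, innocents)        # the FBI decrypts only these
assert toy_encrypt(k2, iv2, receive_covert(msgs)) == secret
print(len(msgs), "innocent messages carried", len(secret), "secret bytes")
```

Each secret byte costs two innocent messages (one per nibble), and the 87%
figure in step 7 is just 1 - (15/16)^32. Note that the toy cipher is only a
sketch; the point is that the channel survives any key-registration scheme
that lets the sender choose which messages to transmit.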
Re: Risks Of Cellular Speech (RISKS-13.89)
Johnathan Vail <vail@tegra.com>
Wed, 4 Nov 92 17:40:57 EST
Dave King <71270.450@compuserve.com> and Peter G. Neumann discuss the
"privacy" issues of cellular phones. They point out that cellular phones are
not private and mention many incidents illustrating this. [Technical note: the
Dan Quayle call was probably not a cellular call but a dedicated or airphone
frequency. Cellular from aircraft is illegal because the aircraft can "see"
too many cells at once. JV]

It should be obvious to anyone that using radio to communicate over distances
of many miles is never private. The real RISK is why people think it can be
private in the first place. In the US we have laws specifically outlawing
listening to cellular phone calls, and a law was recently passed banning the
sale of scanners that can receive cellular frequencies. But even banning new
scanners will not eliminate the millions already sold, or even the older UHF
TVs that can tune those frequencies. These laws cannot change the laws of
physics, but they will promote the *myth* of privacy. People using cell phones
are led to believe that a cell phone is just like a wired phone. Cell phone
vendors fought for a law that cannot be enforced rather than a law requiring
them to warn their customers that there is no privacy. It is cheaper for them
to bury their heads in the sand than to invest in digital technology that
would offer privacy.

And for conspiracy theorists: remember the discussion in which the FBI and NSA
are trying to delay and weaken digital techniques for cell phones. Who do you
think is doing a lot of listening now?

Johnathan Vail vail@tegra.com jv@n1dxg.ampr.org (league@prep.ai.mit.edu)
MEMBER: League for Programming Freedom N1DXG@448.625-(WorldNet) 508-663-7435
London Ambulance Service computer fails again
Tony Lezard <tony@mantis.co.uk>
Thu, 05 Nov 92 10:47:30 GMT
The Times of 5 November 1992 reports that Britain's biggest ambulance service,
the London Ambulance Service, yesterday reverted to full manual control after
another failure in its computer management system forced senior management to
concede it could not cope with its task. It reports that the health secretary,
Virginia Bottomley, was studying a letter from a York-based computer
consultancy claiming that tenders for the contract from experienced providers
of command and control systems were ignored in favour of the lowest bid.

Control room staff had noticed early yesterday that system response was
slowing, and computer back-up procedures had failed to solve the problem.
Because of the faults, there was a 25-minute delay in dispatching an
ambulance. Ten days earlier, the service's chief executive, John Wilby,
resigned following allegations by the public sector union NUPE that a similar
failure could have contributed to the loss of 20 lives. The LAS has challenged
NUPE to substantiate its claim.

Martin Gorham, acting chief executive of the LAS, said that yesterday's
problems occurred when demand was low, and not as a result of system
congestion as had been the case previously. In a statement he said, "Since the
problems with the computer system at the beginning of last week, the LAS has
been operating extra back-up systems, including paper duplicates and voice
confirmation to crews. In addition, the staffing of the control room has been
significantly increased. As a result of these measures, call answering times
have substantially improved, leading to greater efficiency in allocating
ambulances."

Tony Lezard: tony@mantis.co.uk Mantis Consultants Limited, Cambridge, UK.
Alternative email: tony%mantis.co.uk@pipex.net or arl10@phx.cam.ac.uk
London Ambulance Dispatch Computer
paj <paj@gec-mrc.co.uk>
5 Nov 1992 09:22:48-GMT
The London Ambulance Service computer fiasco rumbles on.
According to "Computing" (5 Nov 92) the senior officials of LAS were warned
that the system would be an "expensive disaster" by Michael Page. Page's
company submitted a competing bid which was rejected last year, and he wrote a
series of memoranda to LAS in June and July 1991 warning that the mapping
subsystem (which tracks ambulances and dispatches the nearest to an incident)
was not up to the requirements. "The rule-based, analytical approach used by
the LAS cannot deal as well as an experienced operator with the small minority
of difficult cases. The system wrongly reduces the influence of operators".
Meanwhile Mike Smith, systems manager at LAS stated "One thing that did not
fail was the computer. What seems to have gone wrong is that the people
working on the system were flooded with exception messages - we don't yet know
why. We may have lost local knowledge by breaking up sector desks at the
weekend."
LAS have now gone back to a hybrid system using human expertise to dispatch
ambulances rather than a computer. The two days of chaos last week may have
cost up to 20 lives, although exact figures are of course impossible to obtain.
Paul Johnson (paj@gec-mrc.co.uk). | Tel: +44 245 73331 ext 3245
[Also reported by Chris Welch <me_s420@ceres.king.ac.uk>, with
more from Brian Randell (Brian.Randell@newcastle.ac.uk) and
Lord Wodehouse <w0400@ggr.co.uk>. Complete articles from Computer
Weekly, The Guardian, The Independent, and Computing can be
FTPed from the file risks-14.02LAS in the RISKS: directory. PGN]
Re: Cash dispenser fraud (Kristiansen, RISKS-13.89)
Thor Lancelot Simon <tls@panix.com>
Mon, 9 Nov 92 00:23:31 EST
I had an experience earlier this week that indicates that banks hereabouts
(Connecticut -- but probably at least the Northeastern US in general) are aware
of this problem, and have realized a `solution'. I believe the ATM machines
with the `jaws' are Diebold TABS machines, but I might be wrong. In any case,
I ran into one of these machines with a rather shorter than usual timeout
period, grabbed for my money just as it was pulled back, and didn't get it in
time. I then found that my account had _not_ been credited for the money I
hadn't received, and had a fair bit of difficulty getting the bank in question
to credit me for it when I showed up the next day to complain. I suppose that
since one doesn't get credit for the money the machine takes back, it's
impossible to take part of the money and run. On the other hand, why is it
even necessary for the machine to pull the money back at all, if it's not going
to give credit for it? This `fix' seems to me to be blatant opportunism on the
part of the banks. I know that among the readers of this list are some ATM
programmers; any comment?
[REPLY TO tls@panix.com on this one, please. Cc: risks if you wish. PGN]
Re: Cash dispenser fraud (Mellor on Kristiansen, RISKS-14.01)
Antoon Pardon <apardon@vub.ac.be>
Thu, 5 Nov 92 8:50:47 MET
I don't know the legal situation in other countries, but if I understand it
correctly, in Belgium the bank can't refuse to give you your money when you
state that you didn't receive it. This is because you don't sign a note when
getting money from a cash dispenser, so the bank has no proof that it handed
the money out to you. Counting the remaining money in the till and checking it
against the balance does no good, since the money could have been taken by the
following client.
Antoon Pardon <apardon@vub.ac.be>
