There are reports in today's UK press (echoing previous reports) that "Southern" (New Southern Railway Ltd, a railway operator around London, England) is having problems with the doors on its new multi-million-pound trains. In some cases these will not open even when the train is stopped at a platform. One train is reported to have trapped its passengers inside for 45 minutes. It appears that the new trains are equipped with a GPS-based system to determine whether the train is stopped at a platform and whether the platform is long enough to allow (all) the doors to be opened. If the system cannot determine this, the doors will not open. There have been a number of accounts of the system failing to detect the GPS signal, and reports too that the received signal has overloaded the system. The problem has been compounded by some drivers' inexperience with the new trains, which left them unable to work out promptly how to open the doors manually.

Quoting a spokesperson for Southern: "[The trains] have a selective door opening system on board which takes a combination of GPS satellite signals to tell the train exactly which station it is located at and then ensure only the number of doors that are accommodated on the platform are opened. [ ... ] Sometimes we have had problems with the train not locating itself and thus not opening the doors at all." Apparently, in addition, the system sometimes does not recognise that the train has stopped at a longer platform than originally scheduled, and the system has to be "reprogrammed" by the driver before the train can proceed.

Well, I never! The train has a driver, but a complex technology-based system is installed to open the doors. The driver can override the system, but doesn't know how to -- yet has been instructed how to reprogram certain parameters into the computer. The RISKS are too obvious and too numerous to mention, but isn't this so typical of many designs today? Throw out the simple, well-tried and working system and introduce a complex, untried and failure-prone one. One can assume that someone thought this was a safety feature, but is this a "fail-safe" system? One would have assumed that the designers had taken cognisance of the fiasco on London's Docklands Light Railway, when the inaugural train carrying Her Majesty the Queen failed to align properly (by a few inches, IIRC) at one station and the system locked her inside the carriage ... but that's clearly a naive assumption!
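As a thought experiment, the spokesperson's description reduces to a fail-closed interlock, sketched below in Python. Every name, number, and data structure here is invented (Southern has published no design details); the point is how a fail-closed rule, combined with a manual override the drivers cannot operate, turns a positioning glitch into trapped passengers.

    # Hypothetical sketch of a fail-closed selective-door-opening rule.
    # All names, numbers, and data are invented for illustration; the
    # real system's design has not been published.
    from dataclasses import dataclass
    from typing import Optional

    DOOR_SPACING_M = 20   # assumed: one door set per 20 m of train

    @dataclass
    class GpsFix:
        station_id: str   # where the positioning system thinks it is

    PLATFORM_LENGTHS_M = {"StationA": 240, "StationB": 160}  # invented

    def door_sets_to_release(fix: Optional[GpsFix], train_length_m: int) -> int:
        """How many door sets to unlock; 0 keeps every door locked."""
        if fix is None:                     # no usable GPS signal:
            return 0                        # fail closed, as reported
        platform_m = PLATFORM_LENGTHS_M.get(fix.station_id)
        if platform_m is None:              # train "cannot locate itself"
            return 0
        usable_m = min(train_length_m, platform_m)
        return usable_m // DOOR_SPACING_M   # only doors on the platform

    assert door_sets_to_release(None, 160) == 0   # stuck train, as reported
    assert door_sets_to_release(GpsFix("StationB"), 160) == 8

Note that both failure branches return 0: whether the fault is in the satellite signal or in the platform database, the passengers stay locked in until someone opens the doors by hand.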
A computer glitch shut down computers for more than an hour at all 136 secretary of state offices in Illinois beginning at 9:30am on 2 Sep 2004. This delayed people who were trying to obtain driver's licenses, renew registrations, or conduct other business — although those with preprinted renewal forms were able to be helped. [Source: *Chicago Tribune*, 3 Sep 2004; PGN-ed]
I just went to set up our conference room for a meeting tomorrow morning. I had to blunder around in the dark at first: the light switches have been replaced by a computerized lighting control. Turning on the lights requires finding the control, selecting the appropriate menu, and then selecting the appropriate item to turn on the room lights. Once I'd done that, I was able to see that the computer that turns the room projectors on and off urgently needed virus updates. Thankfully, nobody has thought to computerize the toilets yet. :-)

[Nobody? See RISKS-21.35, 22.73, and 23.20, for example. It may be a slippery slope, but it's happening! PGN]
Calif. Schools Warned of Identity Theft, Associated Press, 2 Sep 2004
http://www.washingtonpost.com/ac2/wp-dyn/A57539-2004Sep2?language=printer

California university officials have warned nearly 600,000 students and faculty that they might be exposed to identity theft following incidents where computer hard drives loaded with their private information were lost or hacked into. Since January, at least 580,000 people who had personal information about them stored in university computers received warnings they might be at risk.

The latest instance of missing equipment occurred in June at California State University, San Marcos. An auditor lost a small external hard drive for a laptop computer. Personal data, including names, addresses, Social Security numbers and other identifiers for 23,500 students, faculty and staff in the California State University system were contained on the missing hard drive. At the University of California, San Diego, and San Diego State University, hackers broke into computers and obtained access to files of personal data for more than 500,000 current or former students, applicants, staff, faculty and alumni. Officials from the Cal State system and UC San Diego said they have no evidence any personal data were stolen. At the University of California, Los Angeles, a stolen laptop in June led officials to notify as many as 145,000 blood donors that their data might be in the open.

A California law requiring people be notified when they might be exposed to identity theft took effect in July 2003. Officials say that might explain the rash of notices. "There's no reason to assume that suddenly in July 2003 all these computer security breaches started occurring," said Joanne McNabb of the Office of Privacy Protection in the California Department of Consumer Affairs. "It's just that we know about them now, when we didn't hear before."

And yet, most schools still try to require students to furnish SSNs to register. I've never once seen a good reason for it; in fact, they regularly furnish "shadow numbers" for foreign students and, upon request, others. And why was this "auditor" carrying the data around at all? As for blood donations: the Red Cross is regularly crying for more donors. I wonder how many others, like me, refuse because they started demanding SSNs.
Missouri's Secretary of State Matt Blunt (who also happens to be a candidate for Governor of Missouri in the November election) has announced plans to allow Missouri voters in the military to send in their ballots by unencrypted e-mail. A supposedly trusted third party (Omega Technologies) will handle the unencrypted ballots and redistribute them to the appropriate ballot counters. Apparently North Dakota is also contemplating a similar scheme. Those voters will have to sign a waiver acknowledging that their votes need not be kept secret.

I suspect regular RISKS readers will be (1) astounded, (2) horrified, (3) concerned that certain e-mail messages could be altered or "accidentally" lost, (4) concerned that the acknowledged loss of privacy might be used to coerce votes or prompt vote selling, (5) etc. There are reportedly at least six million eligible overseas voters who might wish the instant satisfaction of believing that their votes might be counted, but there are also at least six million votes that might thereby be subject to compromise.

[This story may have even more legs than the many on electronic voting systems. For example, see *The New York Times* editorial, The Pentagon's Troubling Role, 31 Aug 2004.]
Nevada voters have become the first in the nation to cast ballots in a statewide election using computers that produced printed paper records of electronic ballots. "Knock on wood, so far things have been working flawlessly," said Secretary of State Dean Heller. Nevada's $9.3 million voting system includes more than 2,600 computers and printers deployed in every county. The system, developed by California-based Sequoia Voting Systems, aims to address concerns that paperless touchscreen votes cannot be properly audited or recounted. "From what I've seen, voters seem to enjoy the experience," says DeForest B. Soaries Jr., chairman of the U.S. Election Assistance Commission. "There hasn't been frustration or confusion." [AP/*USA Today*, 8 Sep 2004; NewsScan Daily, 8 Sep 2004]
<http://www.usatoday.com/tech/news/techpolicy/evoting/2004-09-08-nv-evote-system_x.htm>
[From Dave Farber's IP distribution, excerpted for RISKS by PGN]
http://www.blackboxvoting.org/?q=node/view/78

Consumer Report Part 1: Look at this — the Diebold GEMS central tabulator contains a stunning security hole
Submitted by Bev Harris on Thu, 08/26/2004 - 11:43

Issue: Manipulation technique found in the Diebold central tabulator -- 1,000 of these systems are in place, and they count up to two million votes at a time.

By entering a 2-digit code in a hidden location, a second set of votes is created. This set of votes can be changed, so that it no longer matches the correct votes. The voting system will then read the totals from the bogus vote set. It takes only seconds to change the votes, and to date not a single location in the U.S. has implemented security measures to fully mitigate the risks. This program is not "stupidity" or sloppiness. It was designed and tested over a series of a dozen version adjustments.

Public officials: If you are in a county that uses GEMS 1.18.18, GEMS 1.18.19, or GEMS 1.18.23, your secretary of state may not have told you about this. You're the one who'll be blamed if your election is tampered with. Find out for yourself if you have this problem: Black Box Voting will be happy to walk you through a diagnostic procedure over the phone. E-mail Bev Harris or Andy Stephenson to set up a time to do this. [...]

Full item archived at http://www.interesting-people.org/archives/interesting-people/
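The attack Harris describes depends on the tabulator reporting from a summary table that can silently diverge from the underlying per-precinct records. A minimal Python sketch (the table layout, names, and numbers below are hypothetical, not GEMS's actual schema) of the cross-check that would expose such a divergence:

    # Hypothetical illustration of the class of flaw described above.
    # If reported totals come from a separate, editable summary table,
    # recomputing them from the raw per-precinct records exposes any
    # divergence. All names and numbers here are invented.

    raw_precinct_votes = {                 # the authoritative records
        "precinct-1": {"A": 412, "B": 388},
        "precinct-2": {"A": 301, "B": 355},
    }
    summary_table = {"A": 713, "B": 743}   # what the tabulator reports

    def recomputed_totals(precincts):
        totals = {}
        for counts in precincts.values():
            for candidate, n in counts.items():
                totals[candidate] = totals.get(candidate, 0) + n
        return totals

    # A tampered second vote set would fail this check in seconds:
    if recomputed_totals(raw_precinct_votes) != summary_table:
        raise RuntimeError("summary diverges from raw records")

The point is not that this particular check is what counties should run, but that a tabulator whose reported totals are not forced to agree with its own raw records invites exactly the manipulation described.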
*The Economist* (4-10 Sep 2004) has an interesting article covering the verification of Venezuela's referendum on the recall of President Hugo Chavez ("What really happened in Venezuela?", pp. 52-54). The article is written by Jennifer McCoy, who directed the Carter Center's observer mission in Venezuela.

Two separate tests were used to verify the transmission of the results from the polling stations to the Venezuela National Election Council (CNE) headquarters and the tabulation of the results by the CNE's computers. Note that these tests had nothing to do with the electronic voting machines; they covered other elements of the election system that could also be manipulated. The verification of the electronic voting machine results was performed using paper ballots that the machines printed and the voters inspected before depositing them in a ballot box. According to McCoy, the existence of this paper trail allowed the observers to verify what happened "within the black box of the voting machines". Two audits were performed on these paper ballots: an immediate "hot audit" that was only half completed, and a second audit three days after the election.

Interestingly, the observers also covered the possibility of someone having the machines reprogrammed to produce new paper ballots matching the results and inserting them into the ballot boxes. Specifically, observers were placed in the military garrisons where the ballot boxes were kept, before the random sample of the voting machines to be audited was drawn.

I can see two aspects of this story that will interest RISKS readers. First, the successful use of the electronic voting machines' paper trail to verify a crucial element of the election process. Second, the fact that the mere existence of a paper trail does not in itself guarantee fair, or even verifiable, election results. The election process is a complex system; its auditing must take into consideration all elements of the system. In this case the Carter Center had been mediating in Venezuela for two years, and all tests and observer accessibility requirements were planned in advance.

The complete article is available on-line:
http://www.economist.com/opinion/displaystory.cfm?story_id=3157671
More information, including the audit of the results, can be found on the Carter Center's Web site:
http://www.cartercenter.org/doc1690.htm

Diomidis Spinellis, Athens University of Economics and Business
http://www.dmst.aueb.gr/dds
[A second report on the same article, different enough for inclusion. PGN]

*The Economist*, 4-10 Sep 2004, contains an article by Jennifer McCoy, who directed the Carter Center's observer mission during the Venezuelan election. According to Ms. McCoy, the opposition, who lost the election, contested the results based on three phenomena. The second was that "there was a pattern of polling stations where several electronic voting machines returned an identical result, in what looked like a pre-programmed 'cap' on the number of opposition votes."

The Carter Center planned three tests of the electronic voting system. First, their observers at a random sample of polling stations called in results to mission headquarters, so they could be checked against the official results transmitted from the machines to the National Election Council (CNE). Second, they drew a larger sample of poll results from those received at CNE headquarters, to test the accuracy of tabulation by CNE computers. But as McCoy says, "missing from those tests was what happened within the black box of the voting machines."

They approached that issue as follows. The voting machines had a paper trail: each machine printed a record of the voter's vote which, after inspection, the voter deposited in a cardboard ballot box. The Carter Center was able to perform a "hot audit" of about 1% of the machines, but managed to complete only half of it. They proposed and completed a second audit three days after the vote. The audit was suitably blind and insulated from manipulation (McCoy gives the details), and verified in a practical manner that the machines had accurately reported results. The identical results from some machines fell within the range of probability, as shown by Stanford University statistician Jonathan Taylor.

McCoy suggests that cooperation from the CNE was less than perfect, but that the CNE largely harmed itself: "the vote itself was secret and free, but the CNE's lack of openness, last-minute changes and internal divisions harmed public confidence in that vital institution both before and after the vote." She suggests that in general Venezuelans have become more cynical towards elections and that it will take a "huge effort by both sides to restore trust in this fundamental democratic right before next month's election for governors and mayors."

It appears to have been essential to the efforts to verify the accuracy of the election that the voting machines generated a paper trail in a specific manner. One imagines from McCoy's comments that without the successful effort to verify the results of the machine voting through that trail, any faith in the democratic process could well have been irretrievably lost during this election. I worry that North America and Europe might suffer a similar loss of faith, over a longer time period, unless similarly effective backups and verifications are required for electronic voting methods in our supposedly more mature democracies. For the motives for manipulation are just as strong no matter where one lives, and no country anywhere lacks people who rate voting accuracy less important than obtaining their favored result.

Peter B. Ladkin, University of Bielefeld, Germany
http://www.rvs.uni-bielefeld.de
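The audit mechanics McCoy describes are straightforward to sketch. Below is a hypothetical illustration in Python (invented data, sample size, and names; the Carter Center's actual protocol is in the article) of drawing a random sample of machines and comparing paper-ballot counts with electronic tallies:

    # Hypothetical sketch of the audit step: draw a random sample of
    # machines and compare each machine's paper-ballot count with its
    # electronic tally. Data, sample size, and names are invented.
    import random

    electronic_tally = {f"machine-{i:04}": {"yes": 100 + i, "no": 90 + i}
                        for i in range(1000)}
    paper_count = {m: dict(t) for m, t in electronic_tally.items()}  # honest case

    sample = random.sample(sorted(electronic_tally), k=10)   # ~1% "hot audit"
    mismatches = [m for m in sample
                  if paper_count[m] != electronic_tally[m]]
    print(f"audited {len(sample)} machines, {len(mismatches)} mismatches")

As both reports above emphasize, the sample must be drawn only after the ballot boxes are secured; otherwise machines could be reprogrammed to reprint paper ballots matching doctored totals, and the comparison would prove nothing.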
At a bank recently I watched someone use the ATM to pay their Mastercard bill. Since they weren't making an effort to be very close to the screen, I was able to watch as they went through the menu choices. Then onto the screen popped their credit card number: all 16 digits of it, large and readable from a few feet away.

The customer went on to continue the transaction, but it mystified me that the bank saw it necessary to display the entire number. Even if you've got multiple cards, it would seem that only the last four digits are what a customer would need to make sure they're paying off the card they intended. Right now, the design gives anyone who can see the screen a brief view of the number---in my case approximately 5 seconds. That is certainly long enough for someone talented in number memorization to walk away with it.

Brendan Kehoe <email@example.com> http://www.zen.org/~brendan/

[Don't forget the ease of digital cameras, long lenses, etc. PGN]
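Masking is a one-line fix. A minimal Python sketch (the function name and sample number are mine, not any bank's code) of the display rule Kehoe suggests, showing only the last four digits:

    def mask_pan(pan: str, visible: int = 4) -> str:
        """Show only the last few digits of a card number on screen."""
        digits = pan.replace(" ", "")
        return "*" * (len(digits) - visible) + digits[-visible:]

    print(mask_pan("5412 3456 7890 1234"))   # ************1234

Four digits are enough to confirm which card is being paid, and worthless to a shoulder-surfer.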
Dan Bricklin's elegant essay on the lessons for system design and use of on-line and other information sources, http://www.bricklin.com/learningfromaccidents.htm, is very informative and makes some excellent points about the ability and availability of the general public as participants in disaster recovery. It nicely validates what every IT prophet has been saying, in one form or another, since the early 90s: increases in communications and information-processing capability will lead to more consumption of that resource, enabling organizations to respond quickly to outside changes — indeed, enabling "spontaneous" organizations to form quickly to address issues. Dan makes some great points about what this means for systems and component design.

However, there is one problem with using open tools such as RSS feeds, blogs, wikis and open conference calls: their very openness makes them a path for a future terrorist. A group of terrorists wanting to do something akin to the 9/11 action could now learn from what happened then, and include a number of on-line participants with a role in spreading misinformation, increasing fear and diverting resources. There were instances of misinformation during 9/11 — I remember news items about a carful of bombs being stopped on some bridge in New York, for instance — and the news channels normally apply some form of fact-checking. While the Wikipedia model works well when there is time, and people can check changes against a contributor's past behavior, I think we should be careful with too much openness in a time-pressured situation. Some form of validation needs to be in place.

Espen Andersen (firstname.lastname@example.org), Norwegian School of Management (www.bi.no)
www.espen.com, +47 6755 7177; The Concours Group www.concoursgroup.com
...and as usual, Wired gets the story sideways. The file system used is a FAT (File Allocation Table) system, as popularized by MS-DOS. The rovers use VxWorks as their operating system; a FAT-compatible file system is provided with VxWorks, but it is not the only option.

The FAT system is a notoriously bad choice for use with flash memory devices, because any change to a file requires an update to the table, which would quickly wear out the flash sectors where the FAT is stored. For this reason, the FAT and directory structure are copied to RAM, changes are made there, and at some point the updated FAT and directory are written back to flash. The fact is, the rover file systems contained hundreds of files from the flight portion of the mission; these files were not deleted, resulting in a directory that grew too large for the available RAM.

Regardless, the directory never shrinks; the directory entry for a deleted file is simply marked with a special character, and its chain of FAT entries is marked as available. This was common knowledge circa 1985, and made Peter Norton a rich man for his "discovery" that the file was never truly deleted at all...
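For anyone who missed the Norton Utilities era: deleting a FAT file overwrites just the first byte of its directory entry with the marker 0xE5 and frees its FAT chain; the 32-byte entry itself stays put. A toy Python model (file names and counts invented) of why the rovers' directory could only grow:

    # Toy model of the behavior described above: deleting a FAT file
    # marks its directory entry free (first byte 0xE5) but never removes
    # the entry, so the directory never shrinks. Names are invented.
    DELETED = 0xE5

    directory = []    # one (first_byte, name) pair per 32-byte entry

    def create(name):
        for i, (flag, _) in enumerate(directory):
            if flag == DELETED:               # reuse a freed slot if any
                directory[i] = (ord(name[0]), name)
                return
        directory.append((ord(name[0]), name))   # else the directory grows

    def delete(name):
        for i, (flag, n) in enumerate(directory):
            if n == name and flag != DELETED:
                directory[i] = (DELETED, n)   # marked free, NOT removed
                return

    for i in range(500):                  # undeleted flight-phase files...
        create(f"flight{i:03}.dat")
    for i in range(500):                  # ...and even deleting them all
        delete(f"flight{i:03}.dat")
    print(len(directory))                 # 500 entries still to cache in RAM

Since the rover cached the whole directory in RAM before updating flash, every one of those never-reclaimed entries cost memory whether or not its file still existed.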
BKSACSNI.RVW 20040721

"Security Assessment", Greg Miles et al., 2004, 1-932266-96-8, U$69.95/C$89.95
%A Greg Miles email@example.com
%A Russ Rogers firstname.lastname@example.org
%A Ed Fuller
%A Matthew Paul Hoagberg
%A Ted Dykstra
%C 800 Hingham Street, Rockland, MA 02370
%D 2004
%G 1-932266-96-8
%I Syngress Media, Inc.
%O U$69.95/C$89.95 781-681-5151 fax: 781-681-3585 www.syngress.com
%O http://www.amazon.com/exec/obidos/ASIN/1932266968/robsladesinterne
%O http://www.amazon.co.uk/exec/obidos/ASIN/1932266968/robsladesinte-21
%O http://www.amazon.ca/exec/obidos/ASIN/1932266968/robsladesin03-20
%P 429 p.
%T "Security Assessment: Case Studies for Implementing the NSA IAM"

The introduction tries to explain the NSA (National Security Agency) IAM (Information Assurance Methodology), but is so heavily larded with (management) buzzwords that no clear concept emerges. The indications are that the book is primarily aimed at those who have taken one of the IAM courses, although there is an explicit statement that the material can be used by untrained professionals and also by the "customers" who are undergoing an assessment.

Chapter one describes IAM in words that make it seem very similar to such tools as CoBIT (ISACA's Control Objectives for Information Technology tool), ISO 17799, and the NIST (the US National Institute of Standards and Technology) self-assessment guide. However, almost all of the chapter is devoted to a promotion of sharp negotiation of the scope of an IAM contract, from the vendor perspective. Chapter two reiterates the need to control customer expectations and define contract objectives. (There is more jargon, and also the use of idiosyncratic and undefined acronyms like PASV [Pre-Assessment Site Visit].)

The Organizational Information Criticality Matrix (OICM) described in chapter three is a kind of simplistic business impact analysis. In chapter four, system information criticality and the System Criticality Matrix (SCM) are said to be more detailed than the OICM. Defining system boundaries is acknowledged to be difficult, but neither the explanation nor the examples used are of any help in clarifying the issue. Both the text and the tables used in the "case study" are extremely confusing in regard to the relation between entries in the OICM and the SCM.

The system security environment, described in chapter five, is what most people would know as corporate culture: the general attitudes and behaviours common to an institution. The book suggests finding and using the CONOPS (concept of operations) documentation while admitting that it may not be found in most commercial enterprises. (The authors don't explain that this is basically identical to the common policy and procedures manuals, although they do eventually get around to mentioning these texts.) The TAP (Technical Assessment Plan) is actually just a specific format for a detailed contract, so we have to go through all of that type of editorial comment again, without really getting much information about the recommended TAP structure.

Chapter seven involves the assessment itself, and generally deals with administrative details--and making sure that the customer does not modify the scope of the contract. The eighteen basic information security models get listed, although this seems to be almost an afterthought, rather than the core of the IAM itself. Findings, the report of the assessment results, are described in chapter eight. A sixteen-page example does little more than provide a format.
The close-out report, in chapter nine, is a final sales meeting with the customer. The final report is given in a different, and more general, format in chapter ten. Cleanup work and follow-up sales of consulting are discussed in chapter eleven.

The constant repetition of very basic ideas and the turgid, buzzword-laden text make this work far longer than is justified by the information provided. In addition, the extreme emphasis on the viewpoint of a vendor trying to sell a contract (and protect himself from doing any unbillable work) severely limits the audience for this tome. Essential components of the IAM model and process do not seem to hold any central place in the book; the reader discovers them almost by accident, despite the writing rather than because of it.

copyright Robert M. Slade, 2004   BKSACSNI.RVW 20040721
email@example.com  firstname.lastname@example.org  email@example.com
http://victoria.tc.ca/techrev or http://sun.soci.niu.edu/~rslade