This is a press release <http://www.counterpane.com/cmea.html> from
  * Bruce Schneier, Counterpane Systems, 612 823-1098 email@example.com
  * David Wagner, University of California, Berkeley 510-643-9435 firstname.lastname@example.org
  * Robert Sanders, University of California, Berkeley 510-643-6998 email@example.com
  * Lori Sinton, Jump Start Communications, 415-938-2234 firstname.lastname@example.org

Telecommunications Industry Association algorithm for digital telephones fails under simple cryptanalysis

MINNEAPOLIS, MN. AND BERKELEY, CA., March 20, 1997 - Counterpane Systems and UC Berkeley jointly announced today that researchers have discovered a flaw in the privacy protection used in today's most advanced digital cellular phones. This discovery points to serious problems in the closed-door process used to develop these privacy measures. "This announcement is a setback to the US cellular telephone industry," said Bruce Schneier of Counterpane Systems, a Minneapolis, MN consulting firm specializing in cryptography. The attack can be carried out in a few minutes on a conventional personal computer. Schneier and John Kelsey of Counterpane Systems, along with graduate student David Wagner of the University of California at Berkeley, plan to publish their analysis in a paper entitled "Cryptanalysis of the Cellular Message Encryption Algorithm (CMEA)." Legislators are scheduled to hold hearings today on Rep. Goodlatte's "SAFE" (Security And Freedom Through Encryption) bill, HR 695. The problem affects numbers dialed on the keypad of a cellular handset, including any telephone, PIN, or credit card numbers dialed. The system was supposed to protect the privacy of those dialed digits, but the encryption is weak enough that those digits are accessible to eavesdroppers with a digital scanner. The cryptographers blame the closed-door design process and excessive pressure from U.S. military interests for the problems with the privacy standard.
The cellular industry attempted to balance national security with consumer privacy concerns. In an attempt to eliminate recurring security problems, the cellular standards arm of the Telecommunications Industry Association (TIA) privately designed this new framework for protecting cellular phones. The system uses encryption to prevent fraud, scramble voice communications, and protect users' privacy. These new protections are being deployed in today's digital cell phones, including CDMA, NAMPS, and TDMA.

Not a new problem

As early as 1992, others - including noted security expert Whitfield Diffie - pointed out fatal flaws in the new standard's voice privacy feature. The two flaws provide a crucial lesson for policy makers and consumers, the researchers said. These weaknesses are symptomatic of broad underlying problems in the design process, according to Wagner. Many have criticized the National Security Agency (the U.S. military intelligence agency in charge of electronically monitoring foreign powers) for insinuating itself into the design process, pressuring designers to cripple the security of the cellular encryption technique and hamstringing emerging cellular security technology. "The result is weaker protection for everybody," Kelsey said. "This is another illustration of how U.S. government efforts to control cryptography threaten the security and privacy of Americans," said David Banisar, attorney for the Electronic Privacy Information Center in Washington, D.C. This is not the first report of security flaws in cellular telephony. Today, most cellular phone calls can be intercepted by anyone in the area listening with a scanner, as House Speaker Newt Gingrich learned this past January when someone with a scanner recorded one of his cellular calls. According to FCC estimates, the cellular telephony industry lost more than $400 million to fraud and security problems last year.
CMEA Technology

CMEA is a symmetric cipher, like the Data Encryption Standard (DES). It uses a 64-bit key, but weaknesses in the algorithm reduce the key to an effective length of 24 or 32 bits, significantly shorter than even the weak keys the U.S. government allows for export. Greg Rose, program chair of the 1996 USENIX Security Symposium, put the results in context: "This break does not weaken the digital cellular fraud protections. And it's still true that digital cellular systems are much harder to casually eavesdrop on than analog phones. But it's clear from this break that a determined criminal with technical resources can intercept these systems."

Counterpane Systems is a Minneapolis, MN-based consulting firm specializing in cryptography and computer security. Bruce Schneier is president of Counterpane and author of three books on cryptography and security. David Wagner is a founding member of the ISAAC computer security research group at UC Berkeley. In the Fall of 1995, the ISAAC group made headlines by revealing a major flaw in Netscape's web browser. The authors also hasten to thank Greg Rose for his advice.

[This was also noted by "Tom Zmudzinski" <email@example.com>. Several others noted John Markoff's article in *The New York Times* today. As usual, my local source, the *San Francisco Chronicle*, ran the NYT item without indicating its author. PGN]
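To put the reduced key length in perspective, here is a back-of-the-envelope sketch of worst-case exhaustive-search times. The trial rate below is an assumed figure for illustration, not a benchmark of any actual CMEA implementation:

```python
# Back-of-the-envelope exhaustive-search times for various key lengths.
# RATE is an ASSUMED trial rate for a 1997-era desktop PC, chosen for
# illustration only.

def brute_force_seconds(key_bits, trials_per_second):
    """Worst-case seconds to try every key in a key_bits-bit key space."""
    return (2 ** key_bits) / trials_per_second

RATE = 100_000  # assumed key trials per second

print(f"24-bit key: {brute_force_seconds(24, RATE):.0f} seconds")
print(f"32-bit key: {brute_force_seconds(32, RATE) / 3600:.1f} hours")
print(f"64-bit key: {brute_force_seconds(64, RATE) / (3600 * 24 * 365):.2e} years")
```

At this assumed rate an effective 24-bit key falls in minutes on a single machine, consistent with the "few minutes on a conventional personal computer" claim above, while the nominal 64-bit space would remain far out of reach.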
This was posted on the rec.arts.sf.tv.babylon5.moderated group (Babylon 5 TV series). It is one of the most understandable passages on the problem I have seen. Actually, several different problems. The Risk? The problem we are all kvetching about may not be the underlying problem that will kill us. Gary Grossoehme, Oregon Electronics, GaryG4430@aol.com

Date: 20 Mar 1997 00:08:06 -0500
From: Troy_Heagy@ccmail.orl.lmco.com
Newsgroups: rec.arts.sf.tv.babylon5.moderated
Subject: "The Illusion of Truth" in action

Here is a good example of the press distorting the truth, just as in the recent Babylon 5 episode, "The Illusion of Truth."

Hard Pressed

Tech journalists are more interested in crises like the Explorer bug than the fundamental problems behind them. Ever wonder how the news really works behind the scenes? I got a powerful firsthand lesson on 3 March, when Worcester Polytechnic student Paul Greene discovered that "serious flaw" in Microsoft's Internet Explorer. That's when I became the unwitting source of a sound bite that overshadowed the real news. My first indication that something was up was an e-mail from Gene Spafford, who has been my co-author and editor on three computer-security books. Gene subscribes to firstname.lastname@example.org, a "full-disclosure" mailing list about hot computer security holes. The subject line was "FYI - browser bug." The message pointed to Greene's Cybersnot Web page. As I read the message, my jaw dropped. "Cool," I thought. "I can run any program I want on anybody's computer who looks at my Web page with Internet Explorer." Sort of like ActiveX without the code-signing. Five minutes later, my phone rang. It was Thomas Reardon, who works at Microsoft on IE. "I want you to know that this isn't an ActiveX problem," were the first words out of his mouth.
I told Reardon that I had read the Cybersnot message and didn't think that this IE problem was any more significant than the numerous security problems that have plagued Netscape's Java engine. After all, the Secure Internet Programming group at Princeton University had discovered a dozen or so ways of making Java Virtual Machines run arbitrary machine code. The only difference between their attacks and this one was that you needed to be fluent in Java bytecodes, x86 assembler language, and obscure type systems in order to exploit the Princeton attacks. For the Greene bug, all you needed to know was HTML. But Reardon was worried. He said that his co-workers at Microsoft were certain that the press was going to burn them alive. And the bug was so simple - just two flipped bits in IE's registry entries. Internet Explorer has a list indicating whether files are safe or dangerous to open, Reardon explained to me. URL files and LNK files had been listed as safe, meaning it's OK for IE to open them without first asking the user's permission. They should have been listed as dangerous. Next, my pager went off. My friend Beth Weise, cyberspace correspondent for the Associated Press, wanted me to call another reporter and fill him in. I tried to stress to the reporter that the real problem wasn't Internet Explorer - it's the fact that people use the Windows operating system, which has no built-in security. "What we really need is secure operating systems, but corporate America doesn't buy them," I said. But that didn't make for a good story. The AP story must have gone out over the wire about 10 minutes after I hung up the phone, because I had just sat down to dinner when my phone rang again. This time it was CBS Radio News. They wanted to do an interview-to-tape right then! So I told the guy from CBS the same thing I had told George from the AP. The impact of this bug, I said, was that it acted as if somebody were messing with your computer while you "went out to lunch." 
That "lunch" quote had wings of its own. Within the next 24 hours I was quoted on CNN, CNBC, National Public Radio, and in dozens of publications. The Seattle Times ran my quote. It was really weird, because the woman who wrote the story knows me, knows my home phone number, but she found it easier just to grab the quote from the AP than to call me up and get the story behind the sound bite. This sort of quote reuse is actually typical for the nation's news services. I shouldn't be surprised. But I was upset that everybody focused on the immediate problem - a bug (oh no!) in Internet Explorer. Nobody asked why today's computers are so brittle that a single bug could leave a Web surfer wide open to attack. Nobody made the connection between this bug in Internet Explorer and ActiveX. Microsoft goes to great pains to make sure that security-critical bugs like this don't slip into its applications, and yet this one did. What about signed ActiveX components? They're sure to have security-critical bugs as well - especially since many of them will be written in C++. This is a problem that Java applets simply don't have, because they run within the restricted sandbox environment. Nobody seems to be looking to the future. We're building a wired world, but all those wires are crossed. We've had a lot of warnings. Pretty soon, we're going to start having disasters. It's time we started looking harder at the threats.
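The misclassification Reardon described - two file types marked safe to open that should have been marked dangerous - can be sketched abstractly. The table below is an invented stand-in for illustration, not IE's actual registry layout:

```python
# Illustrative sketch of a "safe to open without asking" table, loosely
# modeled on the bug described above. This is NOT IE's actual registry
# schema; the dict is a stand-in for illustration only.

SAFE_TO_OPEN = {
    ".txt": True,
    ".gif": True,
    ".url": True,   # BUG: shortcut files can launch arbitrary programs
    ".lnk": True,   # BUG: should be False, as the story explains
    ".exe": False,
}

def open_without_prompt(filename):
    """Return True if the browser would open the file with no warning."""
    ext = filename[filename.rfind("."):].lower()
    # Fail safe: unknown types require user confirmation.
    return SAFE_TO_OPEN.get(ext, False)
```

The fix amounts to flipping those two entries back to False; the deeper point of the article is how little stood between that table and arbitrary code execution.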
Yesterday, InterNIC, having lost the receipts from my employer's domain registration, invalidated my employer's domain. (Note that this mail is NOT sent from my employer's domain, but from my local ISP; I am not authorized to speak for my employer.) Like many others in similar situations, our system administrator spent the entire day trying vainly to get in touch with a human being at InterNIC to correct the error. He was eventually successful, but only after 20 hours of "Internet death" during which all e-mail, FTP, and accesses to our Web page bounced. Moral? A single point of failure is just as serious when the point of failure is an organization as when it is a hydraulic valve. Many companies now arrange for backup Internet access should their primary provider fail; as far as I know, there is no way to arrange for backup name service should InterNIC fail. Elizabeth Hanes Perry email@example.com
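The failover arrangement the author wishes existed for name service can be sketched generically: try each provider in order until one responds. The provider names and fetch function below are invented for illustration:

```python
# Generic failover sketch: attempt each provider in turn, falling back
# to the next on failure. Provider names are invented for illustration.

def fetch_with_failover(providers, fetch):
    """Call fetch(provider) on each provider until one succeeds."""
    errors = []
    for provider in providers:
        try:
            return fetch(provider)
        except Exception as exc:
            errors.append((provider, exc))
    raise RuntimeError(f"all providers failed: {errors}")

# Usage sketch with a simulated outage of the primary:
def fake_fetch(provider):
    if provider == "primary-isp":
        raise ConnectionError("link down")
    return f"reached via {provider}"

assert fetch_with_failover(["primary-isp", "backup-isp"], fake_fetch) \
    == "reached via backup-isp"
```

The author's point is precisely that no such fallback list existed for InterNIC: there was exactly one entry, and it failed.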
Mike and Shelly Steen of Santa Rosa CA made a deposit of $3700 to their Bank of America account. The deposit was credited as $37,000. When they repeatedly tried to convince BoA there had been an error, they were told that it could not have been a mistake of that size, because it would have been caught by the bank's verification system. [Thanks to Glenn Story, who spotted this item in the Palo Alto Daily News, 19 Mar 1997, p.28.]
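One common defense against a mis-keyed amount is to require a second, independent entry for large deposits before posting. A minimal sketch with an invented threshold; this is not a description of any bank's actual verification system:

```python
# Sketch of a two-entry verification rule for large deposits. The
# threshold and policy are invented for illustration; nothing here
# describes Bank of America's actual system.

LARGE_DEPOSIT = 10_000  # invented threshold requiring a second entry

def post_deposit(amount_entered, amount_confirmed=None):
    """Return the amount to credit, or raise if verification fails."""
    if amount_entered >= LARGE_DEPOSIT:
        if amount_confirmed is None:
            raise ValueError("large deposit requires a second, independent entry")
        if amount_confirmed != amount_entered:
            raise ValueError("entries disagree; deposit held for manual review")
    return amount_entered

# A correctly keyed small deposit posts directly:
assert post_deposit(3_700) == 3_700
```

Under a rule like this, a $3,700 slip keyed as $37,000 would trip the mismatch check rather than being credited, which is presumably what the bank's claimed "verification system" was supposed to do.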
There was a small fire, and subsequently a small explosion, at a Japanese nuclear waste processing plant in Ibaraki prefecture, northeast of Tokyo. You can probably read the latest info at http://www.asahi.com (a web site of the Asahi Shimbun newspaper; I just checked, and information is available in English, too. It carried the ominous news that a very tiny amount of Cesium-137 was observed at a nearby observation facility. You may not want to keep a bookmark of a particular article you saw at Japanese newspaper sites, though. For reasons unknown to me, they tend to recycle the same file name for different articles and throw away the old ones, so in a few days the same URL often points to totally different news.)

Before getting to the core of the story, I have to explain something I learned from security management. I have been managing our office Internet firewall for some time, and learned a great deal from the book *Firewalls and Internet Security*, by Cheswick and Bellovin. Among the useful tips in the book were to log copious amounts of information for analysis, and to make sure that the log remains available for analysis after incidents. Techniques for this include recording on a safe machine within the internal LAN, recording the log on write-once media, or even using a PC without a network card, so that no one can "break" into the machine via the network to tamper with the system log. These make sure that the log is tamper-proof and accessible after incidents occur.

Well, why am I saying this in relation to the accident at the processing plant? In the morning newspaper the day after the accident, I read that the plant managers could not immediately figure out whether radioactive material had been released into the surroundings. This is such an important thing that I was incredulous.
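The tamper-proof-log advice above can be sketched with a hash chain: each record carries a digest of its predecessor, so alteration or deletion is detectable afterwards. A minimal sketch using a modern hash; nothing here is specific to any firewall product:

```python
# Minimal tamper-evident log sketch: each record stores a hash over the
# previous record's digest plus its own message, so any alteration or
# deletion breaks the chain. SHA-256 is used here only as a convenient
# modern example of the technique.
import hashlib

def append_record(log, message):
    """Append (digest, message), chaining from the previous digest."""
    prev = log[-1][0] if log else "0" * 64
    digest = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append((digest, message))

def chain_intact(log):
    """Recompute every link; False if any record was tampered with."""
    prev = "0" * 64
    for digest, message in log:
        if hashlib.sha256((prev + message).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

log = []
append_record(log, "fire alarm in processing cell")
append_record(log, "operators evacuated")
assert chain_intact(log)
log[0] = (log[0][0], "nothing happened")   # tampering...
assert not chain_intact(log)               # ...is detected
```

Write-once media and an off-network machine serve the same goal by physical means; the chain adds a cheap software check on top.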
Reading on, I found out this happened because of the following design and events:

  - There are four radioactivity sensors around the building to monitor the radiation level in the surrounding area, to see whether radioactive material escapes the containment of the building. I think it is a good thing they have four such meters.
  - But these monitoring data are gathered and displayed in the monitoring room of the processing plant building, and *only* there.
  - Because of the small fire and the smoke that erupted, the operators had to be evacuated, and after the small explosion (most likely an incident in which the high pressure within the sealed confinement of the building punched holes through weak windows, doors, and such) that took place hours after the initial fire, nobody could return to the control room to check the meters.

So the four meters kept measuring the radioactivity levels and sending data to the control room, but nobody could read it. I think extra sensor devices were brought in when the deficiency was realized by the top brass at the plant. I believe the designers of the processing plants took note and plan a revision of their monitoring systems very soon now. After writing all this, I am planning to do some sort of dry run to see if my firewall system can be restored quickly if a disk drive is disconnected and such. There is a Japanese saying that goes something like, "Correct yourself by looking at the behavior of others." I took due note of it.

(As far as the incident goes, it is rumored to be level 3 on the international nuclear event scale, which runs from 0 (the lightest) to 7 (the worst); Chernobyl was level 7 and Three Mile Island was level 5. Because of the expanding warm gas of the fire and smoke and such, the air conditioners that keep the internal pressure lower than the external pressure (to prevent leakage) could not keep running, and when they failed, adjacent rooms got contaminated one by one.
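The single-display failure suggests a general rule: readings that matter in an emergency should be pushed to several independent locations. A minimal fan-out sketch; the sensor and display names are invented:

```python
# Fan-out of sensor readings to multiple independent displays, so that
# losing any one room does not blind the operators. All names are
# invented for illustration.

class Display:
    def __init__(self, name):
        self.name = name
        self.last = {}        # latest reading seen per sensor
        self.reachable = True

    def show(self, sensor, value):
        self.last[sensor] = value

def publish(displays, sensor, value):
    """Push a reading to every display that is still reachable."""
    for d in displays:
        if d.reachable:
            d.show(sensor, value)

control_room = Display("control room")
offsite = Display("offsite monitoring post")
displays = [control_room, offsite]

publish(displays, "stack monitor 1", 0.8)
control_room.reachable = False              # operators evacuated
publish(displays, "stack monitor 1", 5.2)
# The offsite post still shows the latest reading even though the
# control room is now inaccessible.
```

Four redundant sensors feeding one room is redundancy in the wrong layer; the display side needs the same duplication as the sensing side.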
One incredibly stupid thing that happened was that the initial small fire, probably caused by the high-temperature asphalt, was apparently not put out completely, and led to the small "explosion" that tore holes in the supposedly sealed walls. Coupled with the deliberate lies that were told to the public when the Monju breeder reactor had a major accident, releasing hot molten sodium outside its secondary cooling system, the half-public/half-private corporation which controls nuclear power reactor research and waste disposal is under heavy media criticism in Japan right now. Their inept handling of PR this time around, by inexperienced engineers-turned-PR-people, added fuel to the criticism. I think we could learn a lot from efficient American PR people here; I am saying this with tongue in cheek.)
Regarding the slippery handling of private information, such as one's signature information, by the US Postal Service (or was it the Post Office?), I often ask myself whether it can happen to us in Japan whenever I read such articles in RISKS. Well, just this morning, 18 March, I read in the Asahi Shimbun newspaper (Yokohama edition) that the Japanese Postal Service will offer network tracking, starting June 1, for some special services such as registered mail. That is good news. I think they are motivated by the success of FedEx and other private courier services. But I was startled after reading some details. According to the article, customers can type in the number assigned to the package to find out where it is going and other pertinent information. I mean, if I read the article correctly, it seems that the system gives out the information of the addressee(!) WITHOUT any authentication whatsoever! I hope I am mistaken here. Maybe the number itself has some extra field that authenticates the validity of the number, known only to the holder of the assigned number? In any case, I can't wait to mistype a digit/letter of a number assigned to MY package to see if it will print out someone else's supposedly private info. (I checked the Ministry of Posts and Telecommunications' web page for more info: http://www.postal.mpt.go.jp/ Unfortunately, I could not find details about the tracking service, although the page has a mailto: to solicit a catchy name for this new tracking service from the public.) Chiaki Ishikawa, Personal Media Corp., Shinagawa, Tokyo, Japan 142 firstname.lastname@example.org or Chiaki.Ishikawa@personal-media.co.jp
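A standard fix for the guessable-number problem the author raises is to make the tracking code self-authenticating, for example by appending a keyed MAC, so that a mistyped digit is rejected instead of resolving to someone else's package. A sketch with an invented key and format; nothing here describes the actual postal system:

```python
# A self-authenticating tracking code: serial number plus a truncated
# keyed MAC. The key, serial format, and tag length are all invented
# for illustration.
import hmac
import hashlib

SECRET_KEY = b"server-side secret"  # invented; held only by the service

def issue_tracking_code(serial):
    """Attach an 8-hex-digit MAC tag to the serial number."""
    tag = hmac.new(SECRET_KEY, serial.encode(), hashlib.sha256).hexdigest()[:8]
    return f"{serial}-{tag}"

def lookup_allowed(code):
    """Serve package details only if the code's MAC tag checks out."""
    serial, _, tag = code.rpartition("-")
    expected = hmac.new(SECRET_KEY, serial.encode(), hashlib.sha256).hexdigest()[:8]
    return hmac.compare_digest(tag, expected)

code = issue_tracking_code("JP123456789")
assert lookup_allowed(code)
# A mistyped serial keeps the old tag, so the lookup is refused:
assert not lookup_allowed("JP123456780-" + code.rpartition("-")[2])
```

This is presumably the kind of "extra field that authenticates the validity of the number" the author speculates about; whether the real system has one is exactly the open question.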
I recently received this notice from a friend: I just downloaded Netscape Navigator 4.0 preview release 2. At long last, Navigator has an option that will block all cookies without popping up a warning for each one. It is in Edit->Preferences->Advanced. It seems to work properly, too! Curiously, when I installed it, it ignored my current setting ("Always warn before accepting a cookie") and set me up with "Always accept cookies." Users upgrading should be aware of this. Faculty of Medicine, School of Occupational Therapy, Hebrew University / Hadassah Hospital, P.O. Box 24026, Mount Scopus, Jerusalem 91240, Israel
At <http://www.efsl.com/security/ntie> a description may be had of YASH (yet another security hole) in MS-IE, this time involving the silent disclosure of user/"domain"/machine identity info and a transform of the user's "domain" password, which could be used for false-presence attacks or offline cracking. Mark S. (Disclaimer removed)
Amos Shapir notes that many COBOL programmers use "nonsense words for variable names, to avoid reuse of a reserved word." It seems to me this is a problem principally for badly documented programs -- in my experience, the majority. It's hardly limited to Y2K problems. Yesterday I talked with someone who told me about a crucially important program his company uses: it's years old, is utterly undocumented, and was created by a brilliant programmer who was with the company for many years until recently he committed suicide. Now, and only now, they're trying to figure out what to do about the program. Oh, yes: undocumented and written in assembler code. I wish them well. Pete email@example.com
It's worth noting that at least some banks are attempting to get a handle on this problem, now that cards expiring in 2000 and beyond are actually appearing... Wells Fargo, for example, sent a letter to merchants requesting that they attempt a particular "fictitious" transaction, and noted what the various return codes would indicate. In case of a year-2000 problem, the merchant is supposed to contact the bank and eventually receive (for a fee, of course, in most cases) firmware/software upgrades. Naturally, this doesn't do anything right away for the folks who already have the dreaded "00" expiration cards during this early phase of the transition period. --Lauren-- Moderator, PRIVACY Forum www.vortex.com
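The underlying comparison bug - treating a "00" expiration as 1900 - has a standard mitigation known as windowing: map two-digit years into a chosen century using a pivot. A minimal sketch; the pivot value is an assumption, not anything mandated by the card networks:

```python
# Two-digit-year "windowing", a common fix for the card-expiry problem
# described above. The pivot of 80 is an assumed convention chosen for
# illustration.

PIVOT = 80  # two-digit years below this are taken as 20xx

def expand_year(yy):
    """Map a two-digit year into a full year using a sliding window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

def card_expired(exp_yy, exp_month, now_year, now_month):
    """Compare (year, month) tuples after expanding the two-digit year."""
    return (expand_year(exp_yy), exp_month) < (now_year, now_month)

# A card expiring 00 (i.e. 2000) is NOT expired in March 1997,
# whereas naive 19xx expansion would have rejected it:
assert not card_expired(0, 6, 1997, 3)
```

Windowing only postpones the problem (the pivot itself eventually rolls over), which is why the bank's real fix is a firmware/software upgrade at the terminal.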
Phone companies are pretty good about catching lines that are disconnected. More likely this is a life-safety regulation. In Massachusetts the telephone company cannot turn off dial tone on disconnected service, because of the risk to life safety. These 'disconnected' phone numbers can only call 911, the operator, and the NYNEX business office (for new service). Bill
In RISKS-18.90 Dewi Daniels describes a problem where calls to Guyana were billed to his telephone account, apparently in error. This neatly dovetails with a report on the BBC TV programme "Watchdog" which showed that a number of British Telecom customers had suffered similar problems. Although the reason was never fully established, one conjecture was that BT employees with access to calling card details had used them illegally to call overseas premium-rate phone-sex lines. Jon jonsg@harlequin_co_uk See http://pobox.com/~jonsg/junkmail.html
The following is condensed from: FTC: Public Workshop on Consumer Information Privacy; thanks to a pointer in news.admin.announce by "russ-smith" of http://www.consumer-info.org/ Written comments (paper/disk) are due by April 15 for the June 10-13 workshops in Washington, DC. ========================= SUMMARY: The Federal Trade Commission has determined to hold a public workshop devoted to consumer information privacy. The workshop will be divided into three sessions. Session One is intended to gather information as part of a Commission study of the collection, compilation, sale, and use of computerized data bases that contain what consumers may perceive to be sensitive identifying information, often referred to as "look-up services." These data bases typically are used to locate individuals or develop individual background information. ... Session Two will address recent developments in the collection, compilation, sale, and use of personal information online generally, including self-regulatory efforts, technological innovations, and unsolicited commercial e-mail. Session Three will address the same developments as they pertain to children's personal information.
[See RISKS-18.69 for earlier message. PGN] FINAL CALL FOR PAPERS, NEW SECURITY PARADIGMS '97 A workshop sponsored by ACM and the University of Newcastle upon Tyne. Langdale Hotel, Great Langdale, Cumbria, UK 23 - 26 September 1997 More information will be provided on-line as it becomes available. E-mail to: firstname.lastname@example.org use anonymous FTP from: ftp.cs.uwm.edu in directory: /pub/new-paradigms Use World Wide Web from: http://www.cs.uwm.edu/~new-paradigms