RISKS-LIST: Risks-Forum Digest  Tuesday 27 January 2004  Volume 23 : Issue 14

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS (comp.risks)
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

***** See last item for further information, disclaimers, caveats, etc. *****
This issue is archived at http://www.risks.org as
  http://catless.ncl.ac.uk/Risks/23.14.html
The current issue can be found at
  http://www.csl.sri.com/users/risko/risks.txt

Contents: [Seriously backlogged, spammed, and e-mailed with viruses]
  Spirit Rover humbled by classic programming error (Robert Woodhead)
  New virus infects PCs, whacks SCO (Monty Solomon)
  Panel reports DoD SERVE System fatally flawed - bureaucrats in denial (Scott Miller)
  Roadside camera claims car going 406 mph (greep)
  The risks of naming (Ross Anderson)
  "Outsourced and Out of Control" (Lauren Weinstein)
  Pun-intended definitions (PGN)
  UK data protection laws and the Law of Unintended Consequences (Richard Pennington)
  Lie-detector glasses, 90% accurate? (Steve Holzworth)
  DHS protects vendors of anti-terrorism technologies from liability (Jay Wylie)
  Privacy & security threats in one (Jeremy Epstein)
  Rob Slade's review of Marcus Ranum's *The Myth of Homeland Security* (Marcus J. Ranum)
  Proceedings on ... Engineering Principles of System Security ... (Daniel P. Faigin)
  Abridged info on RISKS (comp.risks)

----------------------------------------------------------------------

Date: Tue, 27 Jan 2004 08:31:46 -0500
From: Robert Woodhead <trebor@private>
Subject: Spirit Rover humbled by classic programming error

Is it just me, or is it truly ironic that the Spirit Rover (now, thankfully, on the road to recovery) was brought down by a variant of the classic "fixed length buffer" error?  See:
  http://spaceflightnow.com/mars/mera/040126spirit.html

Even on Mars, it seems, the RISKS are obvious.  And clearly, the Spirit's designers had learned some lessons from previous space experience, and were not too proud of the technological terror they had created -- they had a way to boot into the monitor, so to speak.

Woodhead's Law: "The further you are from your server, the more likely it is to crash."  (particularly appropriate in this case)

  [There was an item on the radio news about the computer having tried to reboot something like 60 times.  Next time we'll have to send gifted SysAdmins up with the rovers?  PGN]

------------------------------

Date: Mon, 26 Jan 2004 22:21:56 -0500
From: Monty Solomon <monty@private>
Subject: New virus infects PCs, whacks SCO

Robert Lemos, CNET News.com, 26 Jan 2004, 5:58 PM PST

A mass-mailing virus quickly spread through the Internet on Monday, compromising computers so that they attack the SCO Group's Web server with a flood of data on Feb. 1, according to antivirus companies.

The virus -- known as MyDoom, Novarg, and as a variant of the Mimail virus by different antivirus companies -- arrives in an in-box with one of several different random subject lines, such as "Mail Delivery System," "Test" or "Mail Transaction Failed."  The body of the e-mail contains an executable file and a statement such as: "The message contains Unicode characters and has been sent as a binary attachment."
  http://news.com.com/2100-7349-5147605.html

  [Oodles of other URLs omitted.  PGN, who has been wading through hundreds of extra messages today.  GRRROAN.]
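  [Illustrative aside on the Spirit Rover item above: the fault reportedly involved the flash file system's in-memory data structures outgrowing available RAM, a cousin of the generic "fixed-length buffer" error Woodhead mentions.  The following C sketch is hypothetical -- an illustration of that generic failure mode under assumed names and sizes, emphatically not the rover's actual flight software -- showing how a fixed-capacity table rebuilt from persistent storage at every boot can wedge a system into a reboot loop once the store outgrows the table.]

  /*
   * fixed_table.c -- a minimal, hypothetical sketch (NOT the rover's actual
   * flight code) of the generic "fixed-length buffer" failure mode: a
   * fixed-capacity in-RAM table is rebuilt from persistent storage at every
   * boot, so once the store quietly grows past the table's capacity, an
   * unchecked copy overruns memory -- and because the store survives resets,
   * every subsequent boot trips over the same condition.
   */
  #include <stdio.h>

  #define TABLE_CAPACITY 8                 /* fixed at build time */

  struct file_entry { char name[32]; };

  static struct file_entry table[TABLE_CAPACITY];  /* in-RAM directory table */

  /* Buggy mount: trusts that the persistent store fits the table. */
  int mount_unchecked(const struct file_entry *store, size_t n)
  {
      for (size_t i = 0; i < n; i++)   /* overruns table[] when n > capacity */
          table[i] = store[i];
      return 0;
  }

  /* Defensive mount: refuses to proceed rather than corrupt memory. */
  int mount_checked(const struct file_entry *store, size_t n)
  {
      if (n > TABLE_CAPACITY) {
          fprintf(stderr, "mount: %zu entries exceed capacity %d\n",
                  n, TABLE_CAPACITY);
          return -1;                   /* fail safe; leave the table alone */
      }
      for (size_t i = 0; i < n; i++)
          table[i] = store[i];
      return 0;
  }

  int main(void)
  {
      /* Simulated persistent store that has accumulated too many entries. */
      struct file_entry store[12];
      for (size_t i = 0; i < 12; i++)
          snprintf(store[i].name, sizeof store[i].name, "log%zu.dat", i);

      if (mount_checked(store, 12) != 0)
          puts("boot degraded: store not mounted, but the system stays up");

      /* mount_unchecked(store, 12) would scribble past table[] on every boot,
       * and rebooting would not help, because the oversized store persists. */
      return 0;
  }

  A length check that fails safe (as mount_checked does), plus a way to boot without mounting the store at all -- the "way to boot into the monitor" Woodhead applauds -- are the standard mitigations for this class of error.]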
------------------------------

Date: Thu, 22 Jan 2004 14:17:58 -0500
From: Scott Miller <SMiller@private>
Subject: Panel reports DoD SERVE System fatally flawed - bureaucrats in denial

A four-member panel (out of a 10-member peer review group) has condemned the Pentagon's Secure Electronic Registration and Voting Experiment for inherent and irreparable security flaws in the public computing infrastructure.

"I think that a dedicated and experienced hacker could subvert the election rather easily..."  - Dr. Aviel D. Rubin, technical director of Johns Hopkins' Information Security Institute.

"The only 100% way we can avoid some of the security issues [raised by the four panel members] is to not do this.  And that is not something we will do..."  - Glen Flood, a spokesman for the SERVE project.

Computerworld article:
  http://www.computerworld.com/securitytopics/security/story/0,10801,89290,00.html
Panel report:
  http://www.servesecurityreport.org/

------------------------------

Date: Wed, 21 Jan 2004 12:23:39 -0800
From: greep <greep@private>
Subject: Roadside camera claims car going 406 mph

This is excerpted from the *Sun* (http://www.thesun.co.uk/article/0,,2-2004031766,00.html):

Driver Peter O'Flynn was stunned to receive a speeding notice claiming a roadside camera had zapped him -- at an astonishing 406MPH.  The sales manager, who was driving a Peugeot 406 at the time, said: "I rarely speed and it's safe to say I'll contest this."  Officials admitted it was a clerical bungle, but insisted he would still be prosecuted.  (The Peugeot 406 Sport has a top speed of 129mph.)

------------------------------

Date: Fri, 23 Jan 2004 17:55:37 +0000
From: Ross Anderson <Ross.Anderson@private>
Subject: The risks of naming

Regular RISKS readers know that many things can go wrong with naming, and affect systems that use ID cards, PKIs, and suchlike.  But this morning I came across a new and quite surprising failure mode.  I suddenly learned that I did not know how to spell my own name!

Recently, we had to manufacture a version of my name in Korean characters (Hangul) for a Chinese new year card.  A local Korean scholar duly assisted and off went the card.  This morning, I was tipped off by one of the recipients that my name was `wrong'.  It turns out that people in Korea who work with information security have arrived at a consensus on the Hangulisation of my name, as indeed they have for many other foreign computer science researchers.  I'm not 'Los An-del-son', as my informant had suggested; `everyone in Korea' knows me as 'Lo-ssue En-da-son'.

So there we have it.  You may be well-known by a name you never knew you had.  I expect there was no way a Korean who was unaware of the consensus could have second-guessed the spelling.  So that's what it's like to be called Gaddafi / Ghazzafi / Qadhafi!

Meanwhile, on Wednesday, I went and got a visa for India, so now my passport has stuff in it in yet another script.  And no doubt when I visited Japan there was at least one version of my name knocking about in at least one kind of kana.

This underlines the risks of the consensus emerging among governments post-9/11, which is that people acquire names only because a government issues a birth certificate; and so governments need to build huge infrastructures of databases, biometrics and ID cards to support this vital social function of knowing people's names.
I rather fear that, in our multicultural world, the task of making everyone's names correct and consistent might lie beyond our technical and organisational capabilities.

------------------------------

Date: Tue, 06 Jan 2004 09:03:47 -0800
From: Lauren Weinstein <lauren@private>
Subject: "Outsourced and Out of Control"

Since the topic of outsourcing is of great concern currently, I've made available a copy of my "Inside Risks" column that will appear in the upcoming February 2004 edition of "Communications of the ACM" (CACM).  It is titled "Outsourced and Out of Control" and is located at:
  http://www.pfir.org/outsourced-cacm

As the column discusses, while the issue of job losses is serious enough, other factors, such as privacy and security risks, also need to be considered!

Lauren Weinstein, lauren@private  1-818-225-2800  http://www.pfir.org/lauren
Co-Founder, PFIR - People For Internet Responsibility - http://www.pfir.org
Moderator, PRIVACY Forum - http://www.vortex.com

------------------------------

Date: Tue, 6 Jan 2004 14:05:15 PST
From: "Peter G. Neumann" <neumann@private>
Subject: Pun-intended definitions

The Sunday *San Jose Mercuri* (4 Jan 2004) had a wonderful article on the 50 best punny definitions of the year.  Here is a sampling of a few with computer technology relevance.

  off-shorn: vt. Getting cut because your job moved overseas.  [Rainer Richter, San Jose]

  Microsofa: n. A piece of furniture that, while it looked fine in the showroom, gradually begins to dominate the living room, eventually forcing you to replace all the other furniture, including the TV, to be "compatible".  [Earl T. Cohen, Fremont]

  motherbored: n. In many homes, a technology discussion at dinner between father and the kids.  (Bruce Kerr)

  Luddate: n. Someone you are going out with who does not understand the [Santa Clara] Valley's obsession with technology.  (Lisa Lawrence, Palo Alto)

  Crisco: n. A person who got fried by buying Cisco at $80 a share.  (Jim Schutz)

------------------------------

Date: Sun, 18 Jan 2004 21:04:05 +0100
From: Richard Pennington <richardhelen.pennington@private>
Subject: UK data protection laws and the Law of Unintended Consequences

There are two cases causing a stir here in the UK where misinterpretation of the UK Data Protection Act has been blamed for serious unintended consequences, including loss of life.

Case 1: A school in Cambridgeshire (UK) advertised for a new caretaker (for US readers, read 'janitor').  Because of child protection legislation, a routine criminal record check was performed by local police on the successful applicant.  Because the applicant had previously lived in another area (Lincolnshire), the local police, as a matter of routine, contacted Lincolnshire police and received a 'clean' report; a similarly 'clean' report was then given to the school, which confirmed the appointment.  The caretaker later murdered two of the schoolchildren (aged 9 and 10).  The resulting inquiry revealed that the caretaker, while in Lincolnshire, had been the subject of multiple relevant allegations (indecent assault and worse), none of which had ever been brought to court.  Lincolnshire police claimed that, under the (UK) Data Protection Act, they were obliged to destroy the records of the alleged offences when the investigations ended without a trial.  As a result, the various investigations in Lincolnshire never heard about each other, and none of the information was forwarded to Cambridgeshire.
Case 2: Responding to an unpaid gas bill, British Gas disconnected the gas supply of an elderly couple in August 2003 (at which time the weather was extremely hot), and did not notify the local Social Services.  A few months later, the couple were found in their apartment, dead from hypothermia (the weather by then being much colder).  British Gas claimed that (under the UK Data Protection Act) they were unable to contact local Social Services because they did not have the written permission of the couple to disclose their financial records.

In both cases, the UK Data Protection Registrar (the official in charge of information protection and privacy) has indicated that the official bodies involved misunderstood the meaning and intent of the legislation, despite existing guidance.  The guidance is now in the process of being rewritten and clarified.  But, in these two cases, four lives were lost.

------------------------------

Date: Tue, 20 Jan 2004 17:33:11 -0500
From: Steve Holzworth <sch@private>
Subject: Lie-detector glasses, 90% accurate?

Starkly excerpted:
  http://www.eetimes.com/story/OEG20040116S0050

It may not be long before you hear airport security screeners ask, "Do you plan on hijacking this plane?"  A U.S. company using technology developed in Israel is pitching a lie detector small enough to fit in the eyeglasses of law enforcement officers, and its inventors say it can tell whether a passenger is a terrorist by analyzing his answer to that simple question in real time.  ...

The company showed plain sunglasses outfitted with the technology at the 2004 International CES in Las Vegas earlier this month.  The system used green, yellow and red color codes to indicate a "true," "maybe" or "false" response.  At its CES booth, V Entertainment analyzed the voices of celebrities like Michael Jackson to determine whether they were lying.  ...

"It is very different from the common polygraph, which measures changes in the body, such as heart rate," said Richard Parton, V's chief executive officer.  "We work off the frequency range of voice patterns instead of changes in the body."

The company said that a state police agency in the Midwest found the lie detector 89 percent accurate, compared with 83 percent for a traditional polygraph.

  [SCH - oh, excellent!  I only have a 1 in 10 chance of being falsely accused.]

Steve Holzworth, Senior Systems Developer, SAS Institute - Open Systems R&D
VMS/MAC/UNIX  Cary, N.C.  sch@private

------------------------------

Date: Sun, 18 Jan 2004 14:11:53 -0500 (EST)
From: Jay Wylie <jwylie@private>
Subject: DHS protects vendors of anti-terrorism technologies from liability

[This note considers an] article "Guarding Against Terrorism--And Liability" by Roland L. Trope in the January 2004 issue of *IEEE Spectrum*.  The article gives some details about the SAFETY (Support Anti-terrorism by Fostering Effective Technologies) Act of 2002.

The act protects vendors of anti-terrorism products that have been vetted by the Department of Homeland Security and designated as QATT (Qualified Anti-Terrorism Technology) from liabilities that arise from any failings of the anti-terrorism technology.  Specifically, the DHS determines a level of insurance that must be carried, and this level caps the liability of the vendor.

Having seen the quality of software that is produced in an environment in which software vendors are free of liability, I am concerned about the quality of products generated under the protection offered by SAFETY.
As well, the DHS's color-coded threat levels do not give me much confidence in its ability to evaluate technologies that offer protection from terrorism (security is complicated, colors are not).

Most disappointing, though, is that a publication of a professional society of engineers is more concerned with making vendors aware of the protection from liability than with questioning whether such protection is appropriate.

P.S. More info on SAFETY can be found at www.safetyact.gov, but the secure Web site's certificate is not configured properly...

------------------------------

Date: Wed, 21 Jan 2004 12:33:59 -0500
From: Jeremy Epstein <jeremy.epstein@private>
Subject: Privacy & security threats in one

I've recently joined LinkedIn, which is one of the crop of electronic meeting places for making business contacts based on six degrees of separation.  I'm pretty suspicious of the privacy issues, and therefore won't tell them about anyone I know, unless I see that they're already members.  That obviously limits the size of my circle, but gives me (and my contacts) more privacy.

To boost the number of members, they recently sent out a message encouraging people to use a new feature: it will compare everyone in your Outlook contact list to the list of members, and tell you which of the people you know are already members.  The tool is an ActiveX control (since I don't run IE, I don't know if it's signed or unsigned).  There are no warnings on their site about the dangers of running code of this sort... but there is a note: "When you start importing, you will see a security warning similar to this one: [image of an ActiveX control approval box]  Simply press the "Yes" button to agree to the upload."

So they're encouraging you to risk both your privacy *and* your security in one easy step.  Thanks but no thanks.

------------------------------

Date: Mon, 26 Jan 2004 20:20:28 -0500
From: "Marcus J. Ranum" <mjr@private>
Subject: Rob Slade's review of Marcus Ranum's *The Myth of Homeland Security*

  [First of all, an explanatory note.  In deciding to run Rob Slade's review of Marcus Ranum's book, *The Myth of Homeland Security*, Wiley, 2004, I thought that -- because of the nature of the topic -- it would be appropriate to give Marcus an opportunity to respond.  Because he has completely encapsulated the content of Rob's review in his commentary, I decided to avoid duplication and just run the response with the review interleaved (and prefaced by "> ").  I hope that is a reasonable strategy for RISKS readers.  PGN]

First off, let me thank Rob for being so kind as to review my book in this forum.  If there's "no such thing as bad publicity", I suppose I must be grateful. :)  I'd like to take the liberty to comment on a few aspects of the review.

> Chapter one asserts that Homeland Security is (along with a number of
> other similar terms) a convenient invention.  Information warfare is
> derided as such a device, and although I could agree in terms of books
> such as Erbschloe's (cf. BKINFWFR.RVW), I don't think Ranum gives
> enough thought to the work by Dorothy Denning (cf. BKINWRSC.RVW).

My book was aimed at a popular audience - the intended victims of the Homeland Security scam - rather than at computer security professionals who are familiar with Dr. Denning's books.  My object in the book is not to engage in a debate of scholarship as much as to point out some of the obvious bogusness that is being put about in the popular media.
As I express in my book, even the "serious" Information Warfare proponents are guilty of using it as a FUD-vehicle to sell their services and products, and completely ignore many serious flaws in the concept - such as the problem of logistics as applied to "cyberweapons."  Simply saying that my book doesn't pay adequate attention to Denning should not justify dismissing it.

> He is also seemingly inconsistent in his positions, arguing
> generally against biometrics and profiling, but then apparently
> endorsing them.

You must have skimmed that section. :)  I pointed out that biometrics actually wouldn't have prevented any recent terrorist incidents, though widescale deployment of biometrics would have been vastly beneficial to the vendors of said systems. ;)  If you consider that an "endorsement", it's an endorsement of faint praise. :)

I was rather dismayed to see that, in another gesture of homeland security grandstanding, the biometrics passport hype has managed to gain some momentum.  Nobody in the homeland security arena seems to be able to address the problem: so what if you know WHO the person is, how do you know what kind of person they are?

> The arguments are not reasoned: he is for a national identity system, but
> admits elsewhere that the 9/11 terrorists had valid identification.

I am NOT in favor of a national identity system.  I DO think it is ridiculous to have 50 states issuing ID based on 50 different trust criteria, using 50 different types of alteration-resistance technology, and with no way of checking to see if they're actually real without calling that state's Department of Motor Vehicles.  The issue I was trying to point out is that, in such an environment, it's silly to be requiring ID before letting someone on a plane.  Does that make me in favor of a national identity system?  My point is that if we're going to go down this route we may as well get it right and stop trying to slap half-arsed measure atop half-arsed measure.

> Chapter seven says that the army is good, the border patrol is looking for
> the wrong things (although this is confusingly amended to a position that
> they have the technology but aren't using it), and the FBI and CIA have an
> ongoing turf fight.

I understand the need to make your review more entertaining by being flippant - but I did NOT say "the army is good", and you know it if you actually read the book.  Chapter seven focuses on the fundamental reasons why military approaches to security, law enforcement approaches to security, and intelligence approaches to security will all be different and are, in fact, highly incompatible.  That's a much different discussion from "the army is good" and you know it. :)

As someone who reviews books for publishers myself, I know the importance of not allowing my personal feelings about a book to influence how I present its contents.  I would have no problem with your review if you had kept it fair (e.g., "Ranum tries to make a case that the FBI and CIA will be unable to ever converge on a cooperative approach to counterterror and fails because blah blah blah"), but jocularly summarizing aspects of a writer's thinking as you do above is more appropriate for a music review in a punk rock magazine than for a review on a technical mailing list.

> Cyberattacks are an unreal myth, says chapter nine, but our
> information infrastructure is mostly undefended.
> The lack of standardization in government systems is seen as making
> government systems harder to defend (even though homogeneity means that a
> single attack can penetrate everything).

Note: the parenthetical above is the reviewer's opinion, not a position I take in my book.  My focus on the feebleness of government security is from a viewpoint of manageability, technical competence, and lack of standardization, not from a "monoculture" hype perspective.

> While this material starts off very well, possibly due to Ranum's greater
> familiarity with strictly technical issues, he makes numerous errors in
> regard to viruses and malware.  His lack of experience in this specific
> area reappears in chapter ten, where he says that even outdated antivirus
> scanners should have caught Code Red because the exploit was a known one.

I am really annoyed by this part of your review.  Never mind that you're obviously the malware guru; in this forum you're pretending to be a book reviewer.  As such, it's not proper to mis-characterize a book to suit your ends.  I said NOTHING like what you attribute to me, above.  My comments on CodeRed specifically and on malware in general were broad and read:

"When CodeRed broke out, my company's system administrator knew about it within an hour, had verified that our systems weren't vulnerable to it, and had gone on to doing something else.  We weren't vulnerable to CodeRed because CodeRed relied on a vulnerability that the security community had known about for the past three months, and that had been fixed by most diligent system administrators."

Does that say something about even outdated virus scanners?  I also touched on malware briefly when I wrote, about CodeRed and Slapper (malware in general): "What the public at large may not realize is that these attacks usually only disrupt organizations that have failed to take even basic precautions to protect themselves."

It is not "out-of-date antivirus scanners" I am referring to here, but the combination of reasonable A/V policy, boundary attachment scanners, firewalls, etc.  These techniques DO work and they work well.  Organizations that get reamed by malware and worms are frequently trying to "have their cake and eat it too" from a security perspective, and prefer to blame others rather than their own inability to follow simple best practices.

I am disturbed that you'd so seriously mis-characterize a book you're reviewing in a public forum like this.  Did you actually read my book before you wrote your review?  If I send you a copy will you read it, please?

> However, scanners would not have caught Code Red since it did not
> write itself out to a file, and also because scanners search for
> strings or patterns, not exploits.  (If anything should have caught
> Code Red it was more likely to have been the firewalls that Ranum has
> made his name in designing.)

I'm just amazed that you're taking off on this point.  The only references to CodeRed in Chapter nine are both on page 141, and nowhere do I say anything about A/V scanners.  Did you actually read Chapter nine, or are you perhaps confusing my book with another?  [Actually, the specific mitigation we had in place at NFR that made CodeRed a non-event was our proxy firewall, as you guessed.]

> Those of us who work in the security field can certainly sympathize
> with the tone of Ranum's work.  Yes, governments (and businesses) are
> foolish.  Yes, the general public sees a complex problem in simplistic
> terms.
> Yes, you can find instances of stupidity in any large
> enterprise.  But does any of this have a real bearing on how security
> can be improved, or how we should look at it?

When the general public sees a complex problem in simplistic terms and is being sold trumped-up stupid solutions for nosebleed prices, it is QUITE useful to point that out to them.  That was the purpose of my book.  Improving security, and how we should look at it, was not the purpose of the book.  My entire career has been spent on the latter issues, working with my peers to do what I can from a technical standpoint.  The purpose of this book was not to teach Joe Sixpack how to design a trust model; it was to help Joe Sixpack understand why he's being asked to spend $30 billion when he would probably have gotten more use out of the money if he'd had a good bonfire with it.

> (Particularly to a non-American audience, this book must read like a long
> string of sometimes whiny complaints.)  Yes, Ranum starts off by saying
> that he is not actually offering solutions, but that bald statement hardly
> absolves him of not offering anything, including insights.

I can only hope my book offers some insights.  I could swear I thought I put a few of them in there when I wrote it, but I can't seem to find them now...

> Presumably, however, we are not the target audience, and the book is aimed
> at demonstrating to the general public that Homeland Security is, as the
> cover graphically puts it, a house of cards.

Finally!  Thank you!  It's good that you put one thing in the review that I can wholeheartedly agree with!  Yes, that's what this book was all about!!!!!!

<sarcasm> Thanks for writing such a careful, perceptive, and fair review. </sarcasm>

------------------------------

Date: Fri, 23 Jan 2004 09:02:38 -0800 (PST)
From: "Daniel P. Faigin" <faigin@private>
Subject: Proceedings on ... Engineering Principles of System Security ...

ACSA has announced the availability of the electronic proceedings for the ACSA-sponsored Workshop on the Application of Engineering Principles to System Security Design (WAEPSSD) at
  http://www.acsac.org/waepssd

The goal of the workshop was to examine engineering fundamentals: the principles and practice of designing and building secure systems.  The engineering principles identified by the workshop as most beneficial to apply to security systems are presented in the two group reports.  The proceedings also contain the workshop position papers, notes from the chair and editor, and a list of contributors and organizers.

Daniel Faigin, ACSA Secretary, Chair: ACSAC 20 (see www.acsac.org)

------------------------------

Date: 7 Oct 2003 (LAST-MODIFIED)
From: RISKS-request@private
Subject: Abridged info on RISKS (comp.risks)

The RISKS Forum is a MODERATED digest.  Its Usenet equivalent is comp.risks.

=> SUBSCRIPTIONS: PLEASE read RISKS as a newsgroup (comp.risks or equivalent) if possible and convenient for you.  Alternatively, via majordomo, send e-mail requests to <risks-request@private> with one-line body
  subscribe [OR unsubscribe]
which requires your ANSWERing confirmation to majordomo@private .  If Majordomo balks when you send your accept, please forward to risks.  [If your e-mail address differs from FROM:, use
  subscribe "other-address <x@y>" ;
this requires PGN's intervention -- but hinders spamming subscriptions, etc.]  Lower-case only in the address may get around a confirmation match glitch.
  INFO  [for unabridged version of RISKS information]
There seems to be an occasional glitch in the confirmation process, in which case send mail to RISKS with a suitable SUBJECT and we'll do it manually.  .UK users should contact <Lindsay.Marshall@private>.

=> SPAM challenge-responses will not be honored.  Instead, use an alternative address from which you NEVER send mail!

=> The INFO file (submissions, default disclaimers, archive sites, copyright policy, PRIVACY digests, etc.) is also obtainable from
  http://www.CSL.sri.com/risksinfo.html
  ftp://www.CSL.sri.com/pub/risks.info
The full info file will appear now and then in future issues.
*** All contributors are assumed to have read the full info file for guidelines. ***

=> SUBMISSIONS: to risks@private with meaningful SUBJECT: line.
*** NEW: Including the string "notsp" at the beginning or end of the subject
*** line will be very helpful in separating real contributions from spam.
*** This attention-string may change, so watch this space now and then.

=> ARCHIVES: http://www.sri.com/risks
  http://www.risks.org redirects you to Lindsay Marshall's Newcastle archive
  http://catless.ncl.ac.uk/Risks/VL.IS.html  [i.e., VoLume, ISsue]
Lindsay has also added to the Newcastle catless site a palmtop version of the most recent RISKS issue and a WAP version that works for many but not all telephones:
  http://catless.ncl.ac.uk/w/r
  http://the.wiretapped.net/security/info/textfiles/risks-digest/
  http://www.planetmirror.com/pub/risks/
  ftp://ftp.planetmirror.com/pub/risks/

==> PGN's comprehensive historical Illustrative Risks summary of one liners:
  http://www.csl.sri.com/illustrative.html for browsing,
  http://www.csl.sri.com/illustrative.pdf or .ps for printing

------------------------------

End of RISKS-FORUM Digest 23.14
************************