Previous Politech message:
http://www.politechbot.com/p-04533.html

---

Date: Mon, 10 Mar 2003 15:32:29 -0800
To: declanat_private
From: Will Doherty <dohertyat_private>
Subject: Re: FC: Bruce Taylor on library filtering case: Justice Dept won?
Cc: dohertyat_private
In-Reply-To: <5.1.1.6.0.20030310173126.0172e808at_private>
Mime-Version: 1.0
Content-Type: text/plain; charset="us-ascii"; format=flowed

Dear Declan and Bruce (and politech readers if this is forwarded),

Keyword-based filters often block search engine results. Even when the result listings themselves are not blocked, a brief listing rarely contains enough information to determine whether a result should have been blocked, and many results are in fact blocked inappropriately, for varied and entirely unpredictable reasons. At least one filtering vendor blocked its own website front page. You can see examples of such behavior in the Online Policy Group's "Online Oddities and Atrocities Museum," under construction at http://www.onlinepolicy.org/research/museum.shtml (your submissions to the museum are welcome).

Internet filtering products simply don't and can't work, since they are not capable of interpreting human language in context. School children in the United States are going to have Swiss cheese for brains because of the randomly inaccessible information Internet blocking will cut from their education. This will be true even if the U.S. Supreme Court upholds CIPA for libraries, since schools weren't covered in this legal challenge.

One Internet with Equal Access for All,

Will Doherty
Online Policy Group, Inc.
http://www.onlinepolicy.org

-------

---

To: declanat_private
Cc: politechat_private, BruceTaylorat_private
Date: Tue, 11 Mar 2003 02:24:44 -0500
Subject: Re: FC: Bruce Taylor on library filtering case: Justice Dept won?
From: terry.sat_private

> out which ones the filter will block. There are no "secret" blocks
> by filters. A filter always tells you when you can't see a site, but
> you have to ask to see the site before the filter is asked if it's blocked.

To the extent there's bias in blacklists, librarians cannot be aware of the patterns in censored content, because many vendors treat those lists as proprietary secrets. Nor can citizens inquire into those patterns and review them for possible biases, short of obtaining the software and breaking the encryption on its list files, or doing massive amounts of testing for blocked URLs (sketched below).

Every blacklist used in a tax-funded system should be available under FOIA. That includes whitelists, since creative bigots have been known to claim they don't deny blacks access to buses or lunch counters, but only allow whites.
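To make that testing point concrete, here is a minimal sketch of such black-box probing. The proxy address, block-page wording, and probe URLs are hypothetical stand-ins, not any vendor's actual interface; it assumes only that the filter runs as an HTTP proxy and marks blocked pages with recognizable denial text or an error status.

    # Sketch: probe a filtering proxy to learn which URLs it blocks.
    # Hypothetical assumptions: the filter is an HTTP proxy at PROXY, and
    # a block shows up either as HTTP 403 or as marker text in the page.
    import urllib.error
    import urllib.request

    PROXY = "http://127.0.0.1:8080"    # hypothetical filter proxy address
    BLOCK_MARKER = "Access Denied"     # hypothetical block-page wording

    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": PROXY}))

    def is_blocked(url):
        """Fetch url through the filter; report whether it came back blocked."""
        try:
            with opener.open(url, timeout=10) as response:
                page = response.read().decode("utf-8", errors="replace")
            return BLOCK_MARKER in page
        except urllib.error.HTTPError as err:
            return err.code == 403   # some filters answer with an error code

    # A real tester would feed in thousands of URLs; two stand-ins here.
    for url in ["http://www.onlinepolicy.org/", "http://example.org/health/"]:
        print(("BLOCKED  " if is_blocked(url) else "allowed  ") + url)

The sketch shows the cost involved: discovering a secret blacklist this way means issuing a request for every URL in question, exactly the massive testing burden a FOIA-available list would remove.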
> Analogize to library card catalogues:
>
> A library has a white card for every book they have.
> The card catalogue doesn't have a card for every book in print on a
> topic and doesn't have a card for the books they don't have.
> When users go to the card catalogue, we only learn which books the
> library has on a topic, not all the other books ever printed on that
> topic.
> If they had a white card for the books they have and a red card for
> the ones they don't, at least patrons would know what else exists on
> a topic.

In the past, this might have been a valid analogy. But due to shared databases, the move from paper card catalogs to searchable electronic ones, interlibrary loans, and access from library computers to additional databases such as "Books in Print," modern libraries in effect have three sets of cards, more inclusive than Bruce's model suggests. They have the "white cards"; "red cards" for what's available on loan from other libraries, covering both in-print and out-of-print materials that are not always easy to purchase even if one chose to; and "yellow cards" for other books that this or some other library might purchase if requested, or that the patron might otherwise be able to buy or borrow elsewhere.

That changed reality of modern library operations exposes other flaws in Bruce's model. Operating an interlibrary loan system costs staff labor and transportation. Were shelf space and acquisition costs not issues, presumably most general-use libraries would carry 100% of the items on the "red" list, additional copies of popular works on the "white" list, and many if not most items from the "yellow" list as well. Web sites equivalent to those lists require zero incremental budget or space to carry multiple copies of popular works and every other material patrons want. In effect, the net enables expanding the "white" list, adding the "red" list at lower cost than interlibrary loans (reversing the traditional cost model), and simply making every item on the "yellow" list available for less than the staff time alone that reviewing patrons' print acquisition requests would consume.

The only libraries that traditionally avoid expansive collections bounded only by cost and space are topical-focus libraries: a university science or arts department, a courthouse, or an elementary school, where materials at reading levels 10-14+ would be wasted and would complicate users' task of finding useful materials easily.

These changed cost models offer additional opportunities to remedy past discriminatory practices, which in turn produce absurdities under Bruce's model. Most US libraries carry almost exclusively English-language works. Some carry Spanish or other-language books to serve significant local community constituencies. Most library patrons interested in minimally represented languages are left to fend for themselves. This economic model, which stops short of designating a legal national language (with all the religious and nationality biases that would carry) while catering to the more prolific minorities, treats members of smaller minorities as if they were less than citizens. Virtually infinite shelf capacity without acquisition costs allows carrying every available foreign-language version of every book in the catalog, loan, and in-print systems, plus near-instant translations (albeit with defects) into nearly any other language. No longer would people whose first language is an Asian or African tribal language, Portuguese, French, German, Hebrew, Arabic, etc. expect to find versions in their primary language missing.

This raises an interesting issue as to most porn sites (whatever "porn" is, with tribute to Potter Stewart's respect for the inability of US law to promulgate a neutral and concrete definition).
Were we to tolerate English-based (and possibly other prevalent-language-based) filters imposing Abrahamic religious prejudices, backed by defective past Supreme Court rulings demeaning sexuality, would English-speaking citizens be denied access to types of sites that persons working in less common foreign languages might find not blacklisted?

What biases would be present in a Jain-friendly filter, a religious perspective that goes beyond the values of most pacifist, Earth-friendly, sexuality-positive Neopagans, but with an ascetic side more aligned with Calvinist religions? Depictions of violence would clearly be offensive, whether advocacy of war, the artifacts of "War on Select Drug Rights" black markets, or the killing of freshly plucked vegetables, never mind animal species of any kind. Raid insecticide ads would be gone. Florida oranges or milk might pass, while Gutrot Arches or broccoli would be out. Cars and plastic goods would be gone, as would ads for clothes if the ideals of Digambara Jains were respected. As such, images of humans wearing clothes might be seen as offensive, though only the mendicants (clergy) of that sect actually vow not to own clothes. Sexuality might be evaluated in its healthy, balanced, and sometimes spiritual forms, and depicted openly. However, if we look to tantra or common Neopagan methods to determine what is or is not healthy sexuality, that is an individual balance issue, and what's out of line is any third party imposing his judgment in place of the individual's guidance on his own path.

That brings us full circle, to recognize that there are no neutral criteria for content-based censorship. Every possible blacklist is based on popular prejudices of religion, ethnicity, artistic ideas, or political viewpoint, and so fails the basic tests of Cohen v. CA or Hess v. IN. Had our courts been entirely honest, Miller v. CA would have collapsed as of 1868 (the 14th Amendment extending the 1st inside the states), and Pacifica indecency would have been treated as anyone's right to interpret for himself outside the US legal system, just as Justice Stewart suggested for "pornography," a term not established in precedent in the 1970s.

The only rational conclusion is that, whether out of greed for economic or political power, or out of the prejudices of zealotry, the core intent of every possible censorship effort, much like the Secondary Effects Doctrine, is to find facially legal excuses for blatantly illegal discrimination. It costs a library less for a patron to learn online about "making sparks" with dripping wet pink pussy than it costs to stock a spare Boy Scout Manual teaching about what for many are less exciting sparks.

Terry

"Censorship is the most insidious form of hate speech."

-------------------------------------------------------------------------
POLITECH -- Declan McCullagh's politics and technology mailing list
You may redistribute this message freely if you include this notice.
To subscribe to Politech: http://www.politechbot.com/info/subscribe.html
This message is archived at http://www.politechbot.com/
Like Politech? Make a donation here: http://www.politechbot.com/donate/
-------------------------------------------------------------------------
Declan McCullagh's photographs are at http://www.mccullagh.org/
-------------------------------------------------------------------------