Saturday, January 30, 2010

Too Serious To Laugh

I'm not too sure what happened this week in the Culpeper County Public Schools in Virginia. A book was pulled, or it wasn't. It was done on purpose, or it was a mistake, or it never happened. It was because of sexual explicitness, or was it homosexual themes? The superintendent supports the action, or maybe not.

The book in question is quite a serious matter, Anne Frank's Diary of a Young Girl: The Definitive Edition, the well-known account of a Jewish family in hiding in the Netherlands during the Nazi occupation. I remember reading the diary in eighth grade, which is around the age when most Americans read it. That makes sense, since the author was right around the age of 14 as she wrote.

Those of us who read the book prior to 1995 missed something. The book we read was a little shorter, Anne Frank's father having withheld some details from the published edition. The Definitive Edition of 1995 was the first time the unabridged diary was published in English, restoring the omitted sections.

And there's the problem. It seems that a few of the restored sections discuss sexual feelings, "including one where the young girl writes about discovering her vagina," as the Culpeper Star-Exponent put it. A parent complained about this, and at least some school officials tripped over themselves rushing to be seen as taking action. One school official, according to newspapers, stated that the book had been pulled. National and international ridicule and condemnation of the administration were swift and merciless. Then administrators tripped over themselves running in the opposite direction.

Removing the dictionary from a California classroom because it contained dirty words was laughably ridiculous. Challenging Buster's Sugartime in Tulsa for daring even to mention a same-sex couple was entertaining buffoonery. But pulling Diary of a Young Girl is not funny at all. It's evil.

Culpeper County School officials now say that pulling the book was a mistake. They have assured the media that the book is still available in the classroom. But it appears there is still going to be a "review" sometime this spring.


The Slippery Slope in Australia

I've said I support, with considerable limitations, the idea of internet filters on computers used by children. I've also been strident in opposing internet filtering for adults, in large part because I see such measures as teetering on the brink of a dangerously slippery slope. The federal government of Australia, in recent months, has taken huge steps in validating my fear: they are showing the world just how slippery that slope really is. The similarities between American and Australian cultures make this move both surprising and horrifying.

Australian law already prohibits Internet Service Providers (ISPs) in that country from hosting what is called Refused Content (RC). ABC News, that's the Australian Broadcasting Corporation, describes RC as including "child sex abuse, bestiality, sexual abuse and detailed instructions for crime or drug use." These restrictions are, of course, justified by the aim of "protecting children."

The new twist, the start of the slide down that slippery slope, is in proposed new legislation that would require all ISPs in Australia to block RC that is hosted on servers outside of the country. Apparently, blocking would be done on the basis of a black list, provided by the federal government, of sites that the ISPs are required to block. This would not require a piece of software residing on every, or even any, user's computer. The prohibited sites would not be available to anyone, minor or adult, because the ISPs themselves, the companies that connect each individual user's computer(s) to the rest of the planet, will be blocking access to the prohibited sites.
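To make the mechanism concrete, here is a minimal sketch, in Python, of how black list blocking at the ISP level works in general terms. The hostnames and addresses are placeholders of my own invention; the actual Australian list and infrastructure are, as discussed below, not public.

    # A minimal sketch of ISP-level black list blocking. All hostnames and
    # addresses here are hypothetical placeholders, not the real (secret) list.
    BLACKLIST = {
        "blocked-example-1.com",
        "blocked-example-2.net",
    }

    def resolve(hostname: str) -> str:
        """Stand-in for the ISP's blocking decision (e.g., at its DNS resolver)."""
        if hostname in BLACKLIST:
            return "0.0.0.0"      # the request silently goes nowhere
        return "203.0.113.7"      # placeholder for normal resolution

    # Every customer of the ISP, adult or minor, gets the same answer:
    print(resolve("blocked-example-1.com"))   # 0.0.0.0 -> unreachable for everyone
    print(resolve("some-other-site.org"))     # 203.0.113.7 -> reachable

Note that nothing runs on the user's machine; the decision happens upstream, at the ISP.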

Think about that a minute: the federal government will have the legal authority to create a list of prohibited websites, blocking anything in the world they choose to block, and to force all ISPs in Australia to go along with that blocking. In theory, the government would only place sites containing Refused Content on the black list. But nobody will be able to verify this, since the process of creating the list will be secretive, the actual contents of the list will be an official secret, and nobody will be able to check the list because all the sites on it will be blocked.

Of course, any such black list, even when created with the best of intentions, must have errors in it, sins of both omission and commission. This has already been demonstrated, since a preliminary version of the list was leaked to the press. The media have mocked with gusto the blocking of the websites of a "dentist and a tuckshop consultant."

And who in their right mind would assume that any government, given the ability to block internet content with impunity, could resist abusing that power?

Similarly bizarre and totalitarian efforts have been attempted in the United States, such as the Communications Decency Act (CDA) and the Child Online Protection Act (COPA), both ultimately struck down in the federal courts. At the moment we operate under the much tamer Children's Internet Protection Act (CIPA), binding only on libraries that accept federal funds, plus a patchwork of state-mandated and voluntary internet filtering efforts. None of these is so centralized or chilling as what Australia is now considering.

And yet, there is that same element here in the U.S. There seems to be no shortage of those who would pressure a library to accept government funds just to force that library to comply with the CIPA. Once the library is CIPA-compliant, there's always someone who appoints herself or himself sheriff, looking over other people's shoulders, and raising a ruckus if somebody is looking at a website the self-appointed sheriff doesn't approve of. And once the ruckus has been raised, there's always some dunderheaded councilperson or commissioner or school board member who will try to take advantage of the mess to impose further restrictions on what adults can access on the web. And once those restrictions are added on, there's always an "error," an "oversight," blocking access to something that should never be blocked.

Australian media are quite right to point out that the proposed law puts their country in the same league with China and Iran. And if it can happen in Australia, it can happen in the U.S.


Friday, January 29, 2010

A Global Embarrassment

I suspect the administrators of the Menifee Union School District had no idea what they were stepping into.  It's not their fault that the editors of Merriam-Webster's Collegiate Dictionary included a definition of the term "oral sex."  It's not their fault that nobody seemed to notice this for years on end, or that nobody made an issue of any of the other "racy" terms the dictionary manages to define.  It's not their fault that some parent finally decided to make an issue of it. In fact, the district is to be commended for giving the parent's challenge careful consideration and for making the correct decision to retain the dictionary.

In spite of these commendable actions, the school district is now a laughingstock. And not just in Riverside County or in California; it's now a laughingstock nationwide and even planetwide. This whole flap could have amounted to no more than an amusing anecdote about American culture, except for one little thing: the thick-witted yokels running the school actually pulled copies of the DICTIONARY from fourth and fifth grade classrooms.  Bad enough, but at least they made up for it by eventually getting some smarts and putting the books back, right?  Well, not quite.  They had to add in an accommodation for stupidity and insanity: parents can now decide whether or not their children will have access to the DICTIONARY in SCHOOL.

I've now seen news reports and scathing commentary about this in three languages and dozens of news outlets around the world.  Picked up by every major news outlet in the US, the story has since moved on to Canada, Australia, the UK, Spain, Mexico, and more.  A few commentators support pulling the dictionary.  Most, however, see nothing more than confirmation of their belief that the US is a very strange place.

Thursday, January 28, 2010

A Very Busy Censorship Day

Is there some unusual alignment of the planets this week?  Is general IQ in the US taking a sudden nosedive?  It's certainly a bizarre week for those of us monitoring censorship issues.  The Volusia County Council in Florida overreacts to an uptight library patron by trying to revoke every adult's right to demand that internet filters be disabled, as if the council never heard of the First Amendment. Deranged school administrators in Riverside County, California, actually pulled copies of Merriam-Webster's Collegiate Dictionary. That was only temporary, thank goodness, but is still pretty shocking (Dudes!  It's a DICTIONARY!).  And then . . . ?

The Tulsa World reported today that a children's book known as Buster's Sugartime has been challenged at the Union School District in Oklahoma.  The problem was that the book mentions (MENTIONS!) same-sex couples.  I'm not going to read the book, but I will quote here from the Tulsa World article, which contains quotes from the book:

Of the book's 31 pages of text and pictures, two short passages mention the same-sex couple: "Buster went to visit his mom's friends Karen and Gillian. They had three children ..." and "Lily's moms, Tracy and Gina, were very good cooks."

According to the article, the parents of one elementary school student formally challenged the book.  A review committee recommended 6-1 that the book be retained.  Undeterred, the parents took their complaint to the school board, arguing at least in part that the book advocates same-sex marriage, and such marriages are not allowed under Oklahoma law.  What that line of argument could possibly have to do with the book was not explained, at least not in the news article.  The school board voted 3-1 to retain the book, and that has ended the issue, at least for now.

I realize some parents will object to even so innocuous a text as this. But to challenge the book's availability, denying other parents the right to make a different choice, is harder to understand.  The review process seems to have worked in this case, but shame on the one member of the review committee and one member of the school board who voted to remove the book.

How Low Can Censorship Go?

You have to laugh, otherwise you'll cry. A January 27th story on MSNBC.com relates how a school district in Riverside County, California, actually pulled the Merriam-Webster's Collegiate Dictionary after a parent complained that it defined the term "oral sex." A review committee decided to return the dictionaries to use, but allowed parents to choose a different dictionary for their children!! I can just picture it, "oh, no, Johnny, you can't use the book all the other kids are using, here's the dictionary your mother said you can use."

About Internet Filters in Volusia County, FL

SafeLibraries was happy, in his post of 22 Jan, to proclaim that Volusia County (FL) officials had tightened up rules about disabling internet filters on computers within the Volusia County Public Library system. It's taken me some time to figure out in more detail what this is about. Having done that, I have to express my doubts about the new restrictions in that county. This will be up to the lawyers to decide (hopefully before it goes to a judge), but I suspect these new restrictions will not stand.

What happened, according to the News Journal Online (14 Jan 2010), is that one adult patron complained about what another adult patron was looking at online. It is not clear that any minors saw the offending material, nor is it clear whether or not the offending material was actually obscene or otherwise illegal. Eventually, the complaint led the County Council, with input from at least some library personnel, to restrict the rights of adult patrons to unblock filtered sites or disable the internet filter on request.

According to a presentation made to the County Council, and available from their online agenda of the meeting of 21 Jan, the new policy requires that a patron wishing to unblock a site or disable a filter must file a request in writing. The form requires the patron to identify himself or herself by name and library card number, requires the patron to state a reason for the request, and allows the library up to 72 hours to respond.

One of the things that this presentation makes clear is that the library accepts federal funds to help defray telecommunications and internet access charges, and is therefore subject to the terms of the Children's Internet Protection Act (CIPA). I'm no lawyer, but I believe that establishes a legal framework that makes the new policy highly questionable. My (non-expert) concerns are:
  1. The new policy defines an adult as someone 18 or older, but the CIPA states a cutoff age of 17.
  2. The new policy requires an adult to state a reason for disabling the internet filter, but the US v. ALA Supreme Court decision (echoed in FCC Order 03-188), appears to require that an adult be able to disable a filter without stating a reason.
  3. The new policy creates a privacy concern by requiring an adult requesting unblocking or disabling to identify himself or herself by name, in writing, and to state the planned internet access. The CIPA appears not to permit this kind of record keeping.
  4. The allowance of 72 hours for consideration of the request is far too long. The US v. ALA decision assumed that disabling could be done in a timely manner. Excessive delays may place an undue burden on the right to access protected speech.
It will be very interesting and educational to watch how this plays out. (More detailed information and additional links are available HERE.)

Wednesday, January 27, 2010

Can the CIPA be fixed?

The Children's Internet Protection Act (CIPA) is a rather badly written law. I don't say that just because I disapprove of it. I say that because it is vague and impractical in places, and creates an administrative headache for many while accomplishing little.

The question is, can it be fixed? And in a word, the answer is NO.

Of course, there are many conflicting ideas of what it would mean to "fix" the CIPA. Many would like to make it more restrictive, while others want it done away with altogether. My personal opinion is that some kind of filtering for computer users who are minors is inevitable, so my list of fixes is somewhere in between: I just want the law to be more practical and manageable.

Why can't it be fixed? That comes down to the fact that the CIPA is just barely legal. It must be kept in mind that the CIPA was found unconstitutional by a US District Court, and that in overruling that decision and upholding the CIPA, the US Supreme Court itself was badly fragmented. Three out of the nine justices that made that decision in 2003 have since been replaced, and presumably all of the justices now have a better understanding of what the internet really is. It is far from clear how the Supreme Court would rule today if presented with the same, or a closely related, issue.

Given the controversial nature of the Act, any significant change in its scope will be tested quite vigorously in court. For this reason, legislators tend to shy away from making anything other than very minor adjustments. Some tweaks have been discussed, but nothing has come of them. In 2006, for example, a bill was proposed to require internet filters (in libraries complying with the CIPA) to restrict access to potentially dangerous chat rooms. This bill never made it into law, and in part that might just be due to which parties controlled which committees and when. But also, legislators feared that their changes might not pass judicial scrutiny, and worse, could get the entire CIPA reviewed by a court that might not be as accepting as in the past.

In spite of this rather sizable difficulty, I think there are some things that could be done to improve the CIPA:
  • Clarify that libraries must unblock or disable a filter on demand by any adult patron without delay or question. At a practical level, this is how things work now, because the US v. ALA (2003) decision imposed this requirement. But the language of the Act itself has not been updated to reflect that decision.  The Act still says that libraries may unblock or disable on demand by an adult patron, and censorship proponents try to make far too much out of the difference between may and must. Changing that one word would have little or no practical impact, but would cut down on a lot of rhetorical nonsense.

  • The FCC could be clearer about what kinds of violations of CIPA terms will cause a school or library to lose the related federal funds and what kinds will not. The law itself being rather vague, the FCC is in a difficult position, and one result is that they sometimes avoid getting specific. Will the FCC actively pursue a return of funds if a library has computers that have no filters installed, even if those computers can only be used by adults? How about if the library refuses to unblock or disable on demand by an adult patron? What if the library tries to create an application and review process for unblocking or disabling, or just takes too long to unblock or disable on demand? Each of these situations appears to violate the terms of the CIPA, but which the FCC will take action on is less than clear.

  • Some kind of standard measure for the effectiveness of Internet Filters, in terms of both underblocking and overblocking, is needed. The law requires that CIPA-compliant libraries have internet filters, but says nothing about how to measure their effectiveness or what effectiveness levels are minimally required. On the one hand, it is clear that an internet filter program could be so ineffective that a library using it would be out of compliance with the CIPA. On the other hand, it was well known when the CIPA was written that all filtering programs both overblock and underblock content, so some degree of ineffectiveness must be acceptable. Between these two extremes, nobody knows where the line is to be drawn, or even how to go about drawing it. It may someday be up to some unfortunate library with poorly designed software (deliberately or accidentally) to test this in court. (A rough sketch of what such a measurement might look like follows this list.)

  • The CIPA probably needs to include language that requires age-grading of access to materials, allowing a 16-year-old (for example) to access material that might be blocked for a 6-year-old. Some filtering software provides this capability, but not all. The CIPA makes no statement about requiring such grading, lumping all minors under the age of 17 into a single category. This is a difficult area of Free Speech law, but court precedent indicates that a 6-year-old and a 16-year-old are not the same with regard to what kinds of images might be too sexually explicit. Failing to address this in the CIPA leaves libraries open to "as applied" challenges from all sides.
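As promised above, here is a rough sketch of what a standardized effectiveness test could look like: run the filter against a set of hand-labeled pages and count the two kinds of errors. The naive keyword filter and the URLs are hypothetical stand-ins of my own, not any real product or official test.

    # A sketch of measuring overblocking and underblocking against a
    # hand-labeled test set. Everything here is hypothetical.
    def filter_blocks(url: str) -> bool:
        """Stand-in filter under test: naive keyword matching on the URL."""
        return any(word in url for word in ("sex", "xxx", "explicit"))

    # (url, should_block): should_block is a human judgment that the page
    # falls into a legally blockable category.
    labeled_urls = [
        ("http://example.org/health/breast-cancer", False),
        ("http://example.org/middlesex-history", False),
        ("http://example.org/explicit/gallery", True),
        ("http://example.org/img/0451.jpg", True),
    ]

    overblocked = sum(1 for u, s in labeled_urls if filter_blocks(u) and not s)
    underblocked = sum(1 for u, s in labeled_urls if not filter_blocks(u) and s)

    total = len(labeled_urls)
    print(f"overblocking rate:  {overblocked / total:.0%}")   # 25%: history page wrongly blocked
    print(f"underblocking rate: {underblocked / total:.0%}")  # 25%: the bare image slipped through

The hard part, of course, is not the arithmetic but agreeing on the labeled test set and on where the acceptable error rates lie.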
If we must deal with something like the Children's Internet Protection Act, at least it could be well written enough to clarify practical requirements. Legislators are, after all, supposed to write law that works, meaning that a reasonable person can interpret what is and is not in compliance, and the probability of drawn-out court battles is minimal. That's what we pay them to do.

Monday, January 25, 2010

Open Letter to the Volusia County Council

Dear Councilperson:

It appears that the Council is considering changes in policy about unblocking and/or disabling internet filters at the Volusia County Public Library. While my understanding of that proposal is limited to what I’ve read in news reports, and is therefore incomplete, I wish to bring to your attention the possibility that the proposed changes could result in the revocation of federal funds the library currently receives, as well as embroil the county in a costly lawsuit. I urge the council to seek the advice of an attorney before proceeding with the proposal.

If I understand the situation correctly, the Volusia County Public Library receives certain federal funds that make the library subject to the terms of the Children’s Internet Protection Act (CIPA). The CIPA requires that libraries receiving those funds have filtering software on all computers with internet access. You should be aware, however, that the Federal Communications Commission (FCC), which administers certification and compliance under the CIPA, requires that adult library patrons be able to disable the internet filter without explaining their intended use. The FCC clarified this in order 03-188, in response to the US Supreme Court decision US v. ALA. The order says, in part:

The Supreme Court found that CIPA does not induce libraries to violate the Constitution because public libraries’ Internet filtering software can be disabled at the request of any adult user and, therefore, does not violate their patrons’ First Amendment rights. In upholding CIPA, the Supreme Court emphasized “the ease with which patrons may have the filtering software disabled,” and that “a patron who encounters a blocked site … need only ask a librarian to unblock it or (at least in the case of adults) disable the filter.” The plurality also highlighted the government’s acknowledgment at oral argument that “a patron would not ‘have to explain … why he was asking a site to be unblocked or the filtering to be disabled.’”

I am given to understand that the library can develop a policy about internet use that reduces the discomfort some patrons might feel in reaction to what other patrons are viewing on a library computer. However, inserting library or other county staff into the process of deciding which adults get to view which internet sites is, in addition to being extremely impractical, fraught with legal and financial implications for the county.

Again, please seek the advice of an attorney before proceeding with the proposed policy changes.

Thank You For Your Consideration.

News Coverage: http://www.wesh.com/news/22308295/detail.html

FCC Summary of the CIPA: http://www.fcc.gov/cgb/consumerfacts/cipa.html

FCC Order 03-188: http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-03-188A1.pdf

US v. ALA Decision: http://caselaw.lp.findlaw.com/scripts/getcase.pl?court=US&vol=539&invol=194

Friday, January 22, 2010

A Response To Ms. Maziarka

I was honored and surprised to find that Ginny Maziarka actually paid any attention at all to my blog, mentioning it on hers, a rare event indeed. But as usual, her thinking is a hopeless muddle. She seems to think I'm using a quote of Supreme Court Justice Potter Stewart to support my position against her brand of censorship in public libraries. What she's doing there is creating a strawman: she claims I said something I never said, attacks what I never said, and then claims victory, hoping nobody is paying any attention at all.

My quotation of Justice Stewart was posted on my blog in isolation, without any comments of my own, offered as an indication of judicial attitudes toward censorship in general. It is quite a poignant quote that way, but I have not used it as evidence or support for anything else. As to my arguments about removing, re-shelving, or labeling books, or otherwise restricting access to books in public libraries, they are all carefully constructed from many other legal sources, all of which she ignores, and all of which are far beyond Ms. Maziarka's limited comprehension of Free Speech law.

Her complaint about my post would make at least a minimum of sense if the books she objected to in West Bend could legally be restricted by ANY law ANYWHERE. But they can't. The books she objected to, like Geography Club, Perks of Being a Wallflower, Heather Has Two Mommies, and (many) others, may annoy some, but come nowhere close to meeting any legal definition of obscene, pornographic, or harmful to minors. Ms. Maziarka has every right to prevent her own children from reading those books, but her attempt to take away the rights of other parents to make a different decision about that is a flat out violation of the First Amendment, censorship as the term has been used in any number of court cases.

It doesn't bother me that she disagrees with me. But I am appalled by her resolute unwillingness to learn. Her attempts at censorship in West Bend were an abject failure. A lot of time and energy went into analyzing and responding to her complaint, and the books are still on the same shelves, without warning stickers, and without restrictions. She failed because what she was insisting that the library do was break the law. She can complain about my quotations all she wants, and she can continue to try to convince would-be censors in other towns that they have the same dictatorial authority over library contents that she tried to assert for herself. But as long as censorship opponents are organized enough to get lawyers and judges involved, she will continue to accomplish nothing.

Thursday, January 21, 2010

Howl On Trial

While hardly a "beatnik" myself, I've always had an appreciation for the Beat Generation writers of the 1950s, especially Burroughs, Ginsberg, and Kerouac. These were the artists and intellectuals who started the youth and counter-culture movements that would become pop-culture (neither intellectual nor artistic) in the 1960s and 70s.

One crystallization of the Beat aesthetic was Allen Ginsberg's short poem Howl, an iconoclastic, counter-cultural canticle, considered a masterpiece by many literary pundits, and panned as trash by defenders of the status quo.

So I read with interest Howl on Trial: The Battle for Free Expression, published by City Lights Books as a 50th anniversary retrospective. The book includes the text of Ginsberg's famous poem, letters about the poem that Ginsberg exchanged with friends, family, writers, and his publisher, excerpts from and commentary on the 1957 obscenity trial, and reactions to the trial by the press and public. The collection of op-ed pieces and letters to the editor of the San Francisco Chronicle is especially interesting, some of it eerily similar to opinions on censorship expressed in 2009.

Of course there had to be a trial. Howl mentions sex with indelicacy. It raises the profane to the level of the sacred. It glorifies nonconformity. And it was the 1950s. Obscenity laws were expansive and allowed much more censorship than courts would permit today. In fact the Roth v. United States case, the beginning of today's legal understanding of obscenity, was decided in 1957, amid the court proceedings against Howl.

The censorship carried out against Howl would be almost unimaginable today. This was a case of prior restraint by government, an act of censorship by anybody's definition. The first printing had created a literary stir, and customs agents seized part of the second printing, which had been shipped from the UK to San Francisco. While the Customs office dithered, public debate about the seizure raised demand. Lawrence Ferlinghetti and City Lights Books had a third printing done, this time within the US, specifically to keep the booklet out of Customs jurisdiction. The US Attorney in San Francisco refused to proceed against the book, and the overzealous Customs office was forced to release the copies they had seized.

In spite of the public ridicule heaped on the Customs office, the San Francisco Police department then undertook their own suppression of the work. They sent undercover (!) officers to City Lights Bookstore to purchase a copy, and on completing the transaction, they arrested the store clerk and issued a warrant for Ferlinghetti. At the trial in a San Francisco Municipal court, the Judge dismissed the charges against the clerk, but the trial continued against Ferlinghetti as the publisher. The court found, of course, that Howl had serious value and therefore could not be classified as obscene. The controversy over the attempted suppression of the work guaranteed its fame, and City Lights immediately ordered a fourth printing of 5,000 copies.

While quite different situations, there are surprising similarities between censorship in 1957 and censorship in 2010. As the 1957 letters to the editor make clear, there are always some citizens who support the censorship of "smut," even if they've never read the supposedly smutty book and have no idea what it actually contains. More importantly, it is clear that in any era those most likely to practice censorship are also those who are least qualified to evaluate the value of literature or any other art. Censorship is, by its very nature, the refuge of the undiscerning.

Wednesday, January 20, 2010

A Society's Confidence in Itself?

"Censorship reflects a society's lack of confidence in itself. It is a hallmark of an authoritarian regime. Long ago those who wrote our First Amendment charted a different course. They believed a society can be truly strong only when it is truly free. In the realm of expression they put their faith, for better or for worse, in the enlightened choice of the people, free from the interference of a policeman's intrusive thumb or a judge's heavy hand. So it is that the Constitution protects coarse expression as well as refined, and vulgarity no less than elegance. A book worthless to me may convey something of value to my neighbor. In the free society to which our Constitution has committed us, it is for each to choose for himself."
US Supreme Court Justice Potter Stewart
In his dissenting opinion, Ginzburg v. United States
383 US 463 (1966)

Sunday, January 17, 2010

The Destruction of Knowledge

Sometimes it's important to step back and look at the big picture. While a topic like censorship in libraries is expansive enough, it is not an isolated phenomenon. It takes place within broader social trends that I'll call "The Destruction of Knowledge." By destruction I mean a deliberate, planned effort to limit what others know, an intent to narrow the available range of information and opinion. By knowledge I certainly mean factual information, but also social perspective, an understanding that human societies are diverse, that opinions vary. By knowledge I also mean an aesthetic appreciation, an understanding of human experience as expressed in art and literature. Knowledge, after all, is not just a collection of facts.

The many(!) attempts in recent years to remove or limit access to books in schools and libraries are a serious problem. Such efforts fail far more often than they succeed, but few censors seem ever to learn that. They mount campaign after campaign, dividing communities, wasting the time of public officials, and exhausting activists on both sides, mostly in efforts that accomplish nothing. It is difficult to comprehend such relentless futility without taking note of the deep distrust that political and religious conservatives have of an informed public, their fear of knowledge itself.

A brief overview of anti-knowledge trends includes:
  • Further reducing already limited public spending on education at all levels.
  • Vouchers and other methods for shifting tax dollars out of public education and into private hands.
  • Sanitizing great literature into mind-numbing pap, as is too often done in the classroom with the writings of Walt Whitman, William Shakespeare, and other masters.  
  • Attempts to remove education about basic human sexuality from public school curricula, including removing information about abortion, birth control methods, and prevention of disease, or replacing medically sound curricula with abstinence-only "education."
  • Attempts to remove education about human evolution from public school curricula by watering down information on evolutionary theory, injecting unscientific doubt, or adding materials on the pseudoscience sometimes called Creationism or Intelligent Design.
  • Lawsuits and other pressure by private schools that teach Creationism to force accrediting agencies or universities to accept the notion that their students are adequately educated.
  • Pressuring school and public libraries to reduce or eliminate books that portray homosexuality in a positive, or even neutral, light, or that give factual information about variation in human sexual expression.
  • Pressuring school and public libraries to carry more books on pseudo-scientific "ex-gay" therapies.
  • Demanding that schools and libraries impose internet filters that go far beyond protecting children from sexually explicit material, blocking access to information that both adults and minors have a constitutional right to access.
  • Pressure to keep worldview-challenging curricula out of the public schools, including courses on comparative religion, comparative political systems, and many aspects of history.
The motivations behind these attempts to narrow the range of available knowledge also motivate the more mundane processes of:
  • Attempting to remove or restrict access to books in school curricula.
  • Attempting to remove or restrict access to books in school libraries.
  • Attempting to remove or restrict access to books in public libraries.
It is often in these mundane book challenges that the true political and religious motivations are most honestly revealed. Some book challengers are genuinely concerned with protecting children from obscene materials, but these are few. Too often, a book challenger makes claims about a book he or she has never read, and relies on the ignorance of others not to get caught. If they have read the book, they grossly exaggerate the objectionable aspects of it, either taking quotes out of context or outright lying about the contents (like calling The Bluest Eye the "bestiality book" or calling Heather Has Two Mommies "sexually explicit"). From time to time the challengers are more honest, clearly stating that they object to a book because they see it as un-American, anti-religious, challenging to authority, glorifying criminality, promoting homosexuality, or any of a long litany of complaints that are excruciatingly documented in endless letters to library directors, school boards, and city councils (and all of which mark a work as constitutionally protected speech).

Underlying it all is the inability or unwillingness of the censorious to live in a modern, pluralistic democracy. They are unwilling to acknowledge that one society can include multiple values and opinions. They are unable to understand that democracy implies that others have a right to make choices different from their own. They fail to grasp that the mechanisms of the state do not exist for the purpose of promoting a single worldview, and assume that the single worldview that should be promoted is their own. They want to force everyone else to live within the same narrow range of knowledge to which they restrict themselves.

It is only in this light that any sense at all can be made of the claims asserted by People Against Bad Books In Schools, Family Friendly Libraries, Know Your Library, SafeLibraries, PFOX, West Bend Citizens For Safe Libraries, and others. Their claim that what they're trying to do is anything other than censorship is rational only within a worldview that does not comprehend a difference between private choices and public policy. Their claim that opposing them undermines democracy itself can only be counted as sensible in the absence of a commitment to pluralism. And to promote those views in a pluralistic democracy that limits what is controlled by public policy in favor of private choices, they must drag everyone else down to their level.  To make their logic work we must all become unlettered, ignorant of history and doomed to repeat it, knowing nothing that might undermine a cherished political position or article of religious faith.

Sunday, January 10, 2010

What the CIPA does (not) Require

The rhetoric of censorship proponents makes it clear that they look with nearly religious awe upon Internet Filters and the CIPA, a law that requires some libraries to implement such filters.  They expect nothing less than miracles, and so can only be disappointed.  I tried to inject some reality into the understanding of Internet Filters in a series of five recent articles (starting HERE), and would now like to do the same for the CIPA (in just one article, albeit long).

The CIPA, or Children’s Internet Protection Act, is a US law that went into effect in 2001. It was quickly challenged in court on constitutional grounds, and at one point was overturned by a US District court as an unconstitutional infringement on Free Speech. On appeal, however, the US Supreme Court overruled the District Court, holding that the CIPA was constitutional (US v. ALA, 2003).  As a result, the CIPA became enforceable and remains in effect today.

The CIPA imposes a set of conditions on the acceptance of government funds under federal programs known as E-rate and LSTA, which are designed to help schools and libraries pay for telecommunications, especially internet access. Because this reduces the costs to schools and libraries, the benefit is called a discount.  In order to receive this discount, schools and libraries have to certify that they comply fully with the conditions imposed by the CIPA.

The CIPA requires the Federal Communications Commission (FCC) to administer the certification process.  The FCC must interpret the CIPA and come up with administrative rules that inform schools and libraries of the requirements and define the process of certification.  If it happens that a library certifies it is in compliance but then the FCC finds that the library has not met all the conditions, the FCC might demand reimbursement of the discount.

Some of the conditions required by the CIPA are:
  1. Schools and libraries must implement Internet Filtering programs that block online access to visual images that contain child pornography or obscenity.

  2. For users who are minors under 17, the Internet Filtering programs must also block online access to visual images that are Harmful to Minors.

  3. Schools and libraries must have an “Internet safety policy” that addresses “the safety and security of minors when using electronic mail, chat rooms, and other forms of direct electronic communications.”
Perhaps more interesting than what the CIPA requires are some of the things the CIPA does NOT require.  Here is a partial list:
  1. The CIPA does not require all schools and libraries to comply with its terms.  Only libraries that accept E-rate and LSTA funds to reduce their expenses for telecommunications are bound by the terms of the CIPA.  Libraries that don’t participate in those programs are free to ignore the CIPA.  Complying with the terms of the CIPA costs money, and each library must evaluate the costs and benefits based on its own financial circumstances before deciding whether or not to participate.

  2. The CIPA does not require Internet Filters that block access to any text.  This surprises many, in part because blocking objectionable text is a feature of many Internet Filtering programs.  A library might happen to implement a filter that blocks text as well as images, but that is not a requirement.  What the CIPA requires is a filter that blocks access to visual images that are a) child pornography or b) obscene, and, for computer users who are minors, c) harmful to minors.  So far as text is concerned, the CIPA allows an 8-year-old to read the writings of the Marquis de Sade.

  3. The CIPA does not require that adult users be blocked from viewing material that is harmful to minors.  Such material must be blocked from access only by users who are themselves minors.  Adults need be blocked only from material that meets the legal definitions of obscenity or child pornography.

  4. The CIPA does not require Internet Filters that block access to Email messages, Instant Messages, or Chat Rooms. Some filtering programs have an ability to monitor these forms of communication while others do not, but no such blocking or monitoring is required by the CIPA.  What is required is a policy, not an Internet Filter, that addresses the safety of Email, chat rooms, and the like.  

  5. The CIPA does not require that Internet Filter programs meet any specific standards of effectiveness.  Although it is well understood that all Internet Filtering products both underblock and overblock access to internet materials, and that the degree to which these errors happen varies from program to program, there is no standard or process for certifying which commercially available filtering products comply or fail to comply with the intent of the CIPA.

  6. The CIPA does not require that Internet Filters remain in place for adult users.  The CIPA requires filters to be installed on all library-owned computers, but allows an adult user to request that a specific site be unblocked or the entire filter be disabled "for lawful purposes." 

  7. The CIPA does not require that Internet Filters be installed on patron-owned computers.  Increasingly, public libraries allow patrons to connect their personal computers to the internet through the library's communications network.  This is allowed even if the patron's computer has no filtering program installed.  Some libraries may have a filter that resides partially or completely on the network itself, rather than on each individual computer, and in that situation the patron's computer will be subject to some degree of filtering.  Other libraries may be using a filter that resides separately on each individual computer, and in this scenario the patron's computer would not be subject to any filtering.  The CIPA leaves it up to each library to choose the structure of the filters it installs, so there is no requirement one way or the other.
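To picture the difference described in item 7, here is a minimal sketch assuming a simple URL black list; all names here are hypothetical, not any real library's setup.

    # Network-resident filter: a proxy at the library's internet gateway sees
    # traffic from every device on the network, patron laptops included.
    BLACKLIST = {"http://example.com/blocked-page"}

    def gateway_allows(url: str) -> bool:
        return url not in BLACKLIST       # applies no matter whose device asks

    # Per-computer filter: software installed only on library-owned machines.
    LIBRARY_MACHINES = {"catalog-pc-1", "public-pc-2"}

    def device_allows(device: str, url: str) -> bool:
        if device not in LIBRARY_MACHINES:
            return True                   # patron-owned laptop: nothing installed
        return url not in BLACKLIST

    url = "http://example.com/blocked-page"
    print(gateway_allows(url))                  # False: blocked for any device
    print(device_allows("patron-laptop", url))  # True: no filter on this laptop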
In addition to what the CIPA requires or does not require, there are two restrictions that come from outside the Act itself.  These are:
  1. A library can establish its own policy as to whether or not the intervention of library staff is required to unblock a site or disable the filter on a computer being used by an adult patron. Some libraries configure the filtering system to allow adult patrons to unblock the filter themselves, without staff intervention.  Other libraries allow only library employees to perform those functions, requiring adult patrons to request staff help if they want to unblock or disable the filter.  Legal experts differ in their opinions as to what the exact requirements of the CIPA are in this regard.  In its administrative orders, the FCC acknowledges that this debate exists, and declines to issue a clarifying ruling.  That leaves the matter up to each library.

  2. Adult patrons do not have to state a reason for requesting unblocking of a site or disabling of the filter, and the unblocking or disabling must be done without significant delay.  This was indicated by the Supreme Court in the US v. ALA decision that allowed enforcement of the CIPA, which decision was echoed by the FCC in its administrative order of 2003 (below).   
It is no doubt surprising to many to learn what the CIPA actually requires and does not require. Considerable confusion arises when individuals learn that CIPA requires Internet Filters and then jump to the mistaken conclusion that the filters required by the CIPA in schools and libraries must block the same material as is typically blocked by an Internet Filter installed in a private home.  Internet Filtering programs are designed to serve multiple markets, of which schools and libraries are only one. The result is that many Internet Filtering programs have features and functions simply not required by the CIPA.

This is, of course, my own (non-expert) analysis, based on orders and summaries published by the FCC and other sources.

I close with some excerpts from FCC orders of 2001 and 2003:

From FCC 01-120:

In order to receive discounts for Internet access and internal connections services under the universal service support mechanism, school and library authorities must certify that they are enforcing a policy of Internet safety that includes measures to block or filter Internet access for both minors and adults to certain visual depictions. These include visual depictions that are (1) obscene, or (2) child pornography, or, with respect to use of computers with Internet access by minors, (3) harmful to minors. An authorized person may disable the blocking or filtering measure during any use by an adult to enable access for bona fide research or other lawful purpose.

From FCC 01-120:

In order to receive discounts, school and library authorities must also certify that they have adopted and implemented an Internet safety policy addressing (i) access by minors to inappropriate matter on the Internet and World Wide Web; (ii) the safety and security of minors when using electronic mail, chat rooms, and other forms of direct electronic communications; (iii) unauthorized access, including so-called “hacking,” and other unlawful activities by minors online; (iv) unauthorized disclosure, use, and dissemination of personal information regarding minors; and (v) measures designed to restrict minors’ access to materials harmful to minors.

From FCC 01-120:

Section 254(h)(5)(D) and (6)(D) permits a school or library administrator, supervisor, or other person authorized by the certifying authority, to disable an entity’s technology protection measure in order to allow bona fide research or other lawful use by an adult. A number of commenters, particularly libraries, express concern that each time an adult user requests that the blocking or filtering software be disabled pursuant to these provisions, school or library staff would be required to make a determination that the user was engaging only in bona fide research or other lawful purposes, and staff would then be required to disable the technology protection measure. Many commenters caution that staff would be unable to satisfactorily make such determinations, and that the requirement would render moot existing policies, have a chilling effect on adults’ Internet use, and significantly impinge on staff time and resources. We decline to promulgate rules mandating how entities should implement these provisions. Federally-imposed rules directing school and library staff when to disable technology protection measures would likely be overbroad and imprecise, potentially chilling speech, or otherwise confusing schools and libraries about the requirements of the statute. We leave such determinations to the local communities, whom we believe to be most knowledgeable about the varying circumstances of schools or libraries within those communities.

From FCC 03-188 (ellipses in the original):

The Supreme Court found that CIPA does not induce libraries to violate the Constitution because public libraries’ Internet filtering software can be disabled at the request of any adult user and, therefore, does not violate their patrons’ First Amendment rights. In upholding CIPA, the Supreme Court emphasized “the ease with which patrons may have the filtering software disabled,” and that “a patron who encounters a blocked site … need only ask a librarian to unblock it or (at least in the case of adults) disable the filter.” The plurality also highlighted the government’s acknowledgment at oral argument that “a patron would not ‘have to explain … why he was asking a site to be unblocked or the filtering to be disabled.’”

Saturday, January 9, 2010

Internet Filters: The Constitutional Headache

(Last in a series of five articles on Internet Filters)

While Internet Filters, in spite of the imperfections documented in this series of articles, can be effective at preventing minors from accessing sexually explicit materials, they can also create a constitutional headache.  Specifically, they can create situations in which the Free Speech provisions of the First Amendment are violated.  These problems can arise because of differences in 1) WHERE a filter is implemented and 2) WHAT a filter blocks. 

The problem of WHERE a filter is implemented is relatively straightforward.  Many filters are designed for use in a private setting, either a home or a corporation, in which Free Speech issues are minimal.  A parent has a legal right to block the internet content of which he or she disapproves. In a corporate workplace Free Speech concerns are a little more significant, but the company still has broad authority to control its employees’ use of the internet, at least where the company owns the computers and internet connection being used.  Many Internet Filtering programs, then, are designed for environments where private decisions can legally take precedence over Free Speech concerns.

The problem of WHAT filters block is a little more subtle, but still critically important.  The problem, if you want to call it that, is that the manufacturers of filtering programs design their products to satisfy their customers, who are mostly parents and business managers, not librarians or lawyers.  Filters tend, then, to block the kinds of things parents and managers want blocked, not just what the law says can be blocked.  In general terms, what the law says can be blocked without creating First Amendment entanglements is:

Child pornography – images of real minors depicting them in a sexual or sexualized manner.

Obscenity – depictions or descriptions of bodies or bodily functions that are, from the standpoint of community standards, prurient and patently offensive, and are, by the standards of a reasonable person, without serious value.  Within limits, the exact definition of obscenity can vary from state to state and locality to locality.

Harmful to Minors – parallels the definition of obscenity, but is evaluated by the standards that apply to minors.  This is material that is too racy for children but not too racy for adults.  For adults, material that is Harmful to Minors is protected speech, and interfering with an adult’s access to such materials may violate the First Amendment.  It is generally legal, though, to block a minor’s access to such materials.  Since the exact definition of obscenity varies from state to state and locality to locality, so does the category of Harmful to Minors, at least in theory.  There is an exception, however, and that is that the Children’s Internet Protection Act (CIPA) contains a nation-wide definition that applies to filtering requirements in all libraries that are bound by the Act (i.e., that accept the government funds that bind them to the Act).

Depending on the product used and the configuration settings chosen by the purchaser, Internet Filters often block a great deal more than these three categories of expressive materials.  Frequently blocked expressive materials include those containing foul language, depicting drug use, glorifying criminality, demeaning a particular religion or religion in general, sex education materials, and information on birth control or abortion.  None of these materials fall within the categories of child pornography, obscenity, or harmful to minors.

Note that this problem of WHAT a filter blocks is not just a question of overblocking.  The reader might recall that overblocking is essentially a programming failure, an inability of the software to make a blocking decision the same way a trained human being would.  That is part of the problem of WHAT filters block, but not all of it.  What’s been described above is overzealous blocking that also happens by deliberate design, when the program is working as intended.
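To illustrate the distinction, here is a minimal sketch of category-based blocking as a deliberate design choice. The category names and profiles are hypothetical examples of my own, though the pattern is typical of products aimed at homes and workplaces.

    # Only three categories can be blocked in a public library without
    # First Amendment entanglements (harmful-to-minors for minors only):
    LEGALLY_BLOCKABLE = {"child_pornography", "obscenity", "harmful_to_minors"}

    # A typical home or workplace profile deliberately blocks much more:
    HOME_PROFILE = LEGALLY_BLOCKABLE | {
        "profanity", "drug_use", "criminal_skills",
        "sex_education", "abortion_information",
    }

    def blocked(page_categories: set, profile: set) -> bool:
        return bool(page_categories & profile)

    sex_ed_page = {"sex_education"}
    print(blocked(sex_ed_page, HOME_PROFILE))       # True: by design, not a bug
    print(blocked(sex_ed_page, LEGALLY_BLOCKABLE))  # False: protected speech

When a library deploys such a product with a home-style profile, the extra blocking is not a malfunction; it is the product doing exactly what it was designed to do.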

The problems of WHERE filters are implemented and WHAT they block interact with each other.  In a private setting it matters little what the filter blocks, as long as the purchaser of the filter is satisfied.  In other words, the WHERE makes the WHAT irrelevant.  In a public setting, however, the WHERE makes the WHAT absolutely critical.  In a public library above all, legal precedents have established that protections of Free Speech are quite strict.  In a public library the purchaser of the filtering program is a government agency, which has a constitutionally limited authority to dictate to private citizens what they may read or look at.  In a public setting, blocking needs to be more carefully tailored to filter out only materials that are Child Pornography, Obscenity, and material that is Harmful to Minors (if accessed by a minor).  Blocking access to more than this – for either adults or minors – may violate the First Amendment and may result in a Free Speech lawsuit against the library and its city or county.

Again, none of this is to claim that Internet Filters don’t work at all or shouldn’t be employed.  The purpose of this series of articles has simply been to raise awareness of some of the real-world limitations of Internet Filters.  Filters don’t control all the means of communication a computer user has, are sometimes designed to restrict access more broadly than the law requires or allows in the context of a public library, and are known to fail to meet their own design objectives some of the time.  It is especially important for censorship proponents to learn that Internet Filters are NOT magic.  Even with the best filtering products available, some library patrons are going to be able to gain access to pornographic materials, and minors may not be protected from all the means of communication an internet predator might use.  Filters could possibly be made more restrictive, but this will accomplish nothing if a court orders the filters deactivated because they infringed excessively on someone's Free Speech rights.  The keys to accomplishing a modicum of protection with Internet Filters are balance and realistic expectations.


Previous articles in this series:
            What are Internet Filters?


Friday, January 8, 2010

CIPA Compliance in Brownsville, Texas?

In his January 6th post, SafeLibraries takes to task the Brownsville Public Library, which he accuses of falsely claiming to be in compliance with the Children's Internet Protection Act (CIPA).

I am in no position to verify the details of the situation in Brownsville, but if they are as SafeLibraries claims, he could have a valid point. That is, IF the Brownsville Public Library has certified to the FCC that it is in compliance with the CIPA, and  IF it is also true that the library has computers that are permanently unfiltered (as SafeLibraries indicates), then the library might actually be violating the terms of the CIPA, and could be subject to an FCC demand for return of the federal funds it has received from programs related to the CIPA.

So far as I understand the requirements of the CIPA, a compliant library must have internet filters installed on every computer, even computers in a back office that is not open to the public.  While any adult patron (or employee) can request that the filter be disabled, the filter would have to be re-enabled once that individual finishes using the computer.
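To make that re-enabling point concrete, here is a minimal sketch of session-scoped disabling under my (non-expert) reading of the Act. Everything here is hypothetical, not any actual library's system.

    class PublicTerminal:
        def __init__(self):
            self.filter_enabled = True      # default state: filtered

        def start_session(self, patron_is_adult: bool, wants_unfiltered: bool):
            # Only an adult may have the filter disabled, and only on request.
            if patron_is_adult and wants_unfiltered:
                self.filter_enabled = False

        def end_session(self):
            self.filter_enabled = True      # re-enable for the next user

    terminal = PublicTerminal()
    terminal.start_session(patron_is_adult=True, wants_unfiltered=True)
    print(terminal.filter_enabled)   # False: this adult browses unfiltered
    terminal.end_session()
    print(terminal.filter_enabled)   # True: the next patron starts filtered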

Part of SafeLibraries' post is about an adult patron who was able to download child pornography on a library computer (the patron was convicted and jailed, according to news reports).  It is important to point out that such an incident, by itself, is not evidence that a library is out of compliance with the CIPA, and can be difficult to evaluate without a lot more details about how the illegal pornography was obtained.  In a library that is complying fully with the CIPA, a patron might still be able to download illegal pornography because a) the filtering program underblocked, meaning that it failed to live up to its intended design, b) the pornography was transferred as an attachment to an email or instant message that was not monitored by the filter or could not be identified as containing illegal images, or c) the patron simply had requested that the filter be disabled while he was using the computer.  Downloading child pornography is a crime regardless of how it is accomplished.  But even in a library that is fully CIPA-compliant, it is possible for such a crime to take place.

A recent post by the American Library Association documents a somewhat similar situation at the public library in Groton, Connecticut.  Police are investigating the possibility that an adult patron downloaded child pornography on a computer in the library, in spite of internet filtering.  In Groton, however, there have yet to be any accusations that the library failed to comply in any way with the CIPA.

Thursday, January 7, 2010

Internet Filters and Email, Chats, and Attachments

(Fourth in a series of five articles on Internet Filters)

In spite of the problems of overblocking and underblocking, Internet Filters can be reasonably effective at preventing minors from accessing sexually explicit websites.  It is important to recognize, however, that computer users often do more than just surf the web.  Computer users also communicate with each other by a variety of means that are not as well controlled by filtering programs.  The most important of these are:

Email.  Users can exchange electronic mail messages with each other using a variety of programs.  Most email systems allow files to be attached to messages, including files that contain photographic images.

Instant Messaging (IM).  While email is asynchronous, meaning that the sender and receiver don't have to be connected at the same time, IM (online chatting) takes place in real time: the sender and receiver have to be logged on at the same time.  Increasingly, the various programs that provide IM capabilities allow file attachments, meaning that files containing photographic images can be transferred from one user to another this way.

File Transfer Services (Uploads and Downloads). There are websites that exist solely for the purpose of allowing a computer user to store one or more files online, with the option of keeping those files private or making them available to selected users or to any user.  Any user knowing the web address of an accessible file or list of accessible files can download (copy) the files to her or his own computer.

Different Internet Filtering programs handle these kinds of computer-based communications differently.  They're all limited to the same kinds of Black List, White List, and Text Pattern checking used to decide whether or not to allow access to ordinary websites, but the ad hoc nature of these other forms of communication makes the filtering program less effective.  Users can therefore sometimes transfer pornographic images straight through the filtering program without the program ever noticing a problem.  As long as there are no clues in plain text (in a message, in the name of the file being transferred, or in text tags that might be embedded within the file), the file will be passed through to the user.  Remember that without plain-text clues, the filtering program has no way of knowing what an image file might be about.
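To make this concrete, here is a minimal sketch, in Python, of the kind of text-only screening just described.  It is not taken from any real filtering product: BLOCKED_TERMS, screen_attachment, and the sample data are all invented for illustration.  Notice which arguments the function actually examines, and which one it ignores.

# A minimal sketch of text-only screening of a file transfer.
# Everything here is invented for illustration; no real product's
# code or term list is shown.

BLOCKED_TERMS = {"tabooed-term-1", "tabooed-term-2"}  # placeholder list

def screen_attachment(message_text: str, filename: str, file_bytes: bytes) -> bool:
    """Return True if the transfer should be allowed.

    Checked: the message text and the file name, both plain text.
    Not checked: file_bytes.  The image itself is never examined,
    so an innocently named file passes no matter what it depicts.
    """
    visible_text = (message_text + " " + filename).lower()
    return not any(term in visible_text for term in BLOCKED_TERMS)

# Example: an explicit photo named "vacation.jpg", sent with the
# message "here you go", sails straight through.
print(screen_attachment("here you go", "vacation.jpg", b"<image bytes>"))  # True

The bytes of the image are accepted or rejected without ever being opened, which is precisely the gap described above.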

In a library setting there is often another – much larger – gap in what Internet Filters can control: the patron-owned computer.  Many public libraries now provide free network connections, allowing patrons to bring their own laptop computers from home and connect to the internet while at the library.  In some libraries, patrons can plug in to the network with an Ethernet cable; increasingly, libraries provide wireless networking, allowing patron-owned computers to connect just by being inside the library building.  The gap, of course, is that there may be no filtering program at all on the patron-owned computer, so the patron can use her or his computer to access any internet materials whatsoever.

The point of all this is not to claim that Internet Filters don’t work.  Rather, the point is to inject some realism into a subject that is often treated with a great deal of wishful thinking and at times outright fantasy.  Internet Filters can be reasonably effective at preventing children from accessing sexually explicit materials available on websites.  They are significantly less effective at controlling communication that takes place by Email, Instant Messaging, or file transfers.  A patron-owned computer may have no Internet Filter installed at all, and can be used to display any internet content the user wishes to display.

Previous articles in this series:
            What are Internet Filters?

Future article in this series:
Internet Filters: The Constitutional Headache 

Wednesday, January 6, 2010

Boy Meets Boy


Boy Meets Boy is an LGBT-themed young adult novel written by David Levithan. It stands out from other novels in its genre because, rather than dealing with the difficulties of being an LGBT teen, it is set in a "gaytopia": a town where the Gay-Straight Alliance has more members than the PTA, the homecoming queen is also the star quarterback, and it's truly okay to be who you want to be. The conflict in the story arises from mixed signals, which leave the main character, Paul, caught in a love triangle while his friend Joni is dating a guy he thinks is bad news. Boy Meets Boy is a well-written and wonderful little story about love and tolerance. I think it would be extremely beneficial and comforting to teens struggling with their sexuality and looking for a little hope.

The book is rarely challenged because there simply is no objectionable content, aside from LGBT themes presented without condemnation. There is no coarse language, sexual content, drug use, or anything else that tends to cause controversy. There is kissing, but it is Disney-esque and completely G-rated. There is one mild sexual reference: Paul is visiting a gay friend whose parents disapprove, and the friend's mother keeps coming into the room; Paul privately wonders if she thinks he'll begin "ravishing her son on the table if she doesn't come in the kitchen every ten minutes for a glass of water." That's it. David Levithan has mentioned that he specifically wrote the book to be as clean as possible. The only reason Boy Meets Boy would be challenged is the oh-so-scary "homosexual themes."

In spite of its G rating, challenges to Boy Meets Boy crop up.  It was challenged in 2009 at the West Bend (Wisconsin) public library, after the library put it on a gay-themed reading list.  The Oklahoma Library Association noted a challenge in 2003. Neither of these direct challenges was successful, although an article in Random House Inc. magazine indicates that the book is stolen from libraries with some frequency.

[This article was written in collaboration with Meghan of tolerance-megitty.blogspot.com]

Tuesday, January 5, 2010

Can Internet Filters Identify Obscene Images?

(Third in a series of five articles on Internet Filters)

A common misunderstanding about Internet Filters is the belief that such programs examine photographs or similar computer images and decide whether the content of the image is pornographic.  This is not a capability that filtering programs have, nor is it reasonable to expect that they can.

At a technical level, computers think quite differently from the way humans think.  Recognizing patterns, even imperfect ones, is easy for human minds but extremely difficult for computers.  A person, for example, can read the wildly varying handwriting of many different writers, while computers can decipher very few handwriting styles.  Humans can understand spoken language in many different accents, while computers are easily confused by even slight differences in pronunciation (think about the last time you spoke your account number into one of those automated telephone banking or airline reservation systems).


Character Recognition is a good example.  Most of us have seen an image like the one depicted on the right (a CAPTCHA) when creating some kind of online account.  The whole point of this kind of image is to make sure that it's a human being, and not a computer program, that is creating the account.  This is effective because the distortion of the letters makes it almost impossible for a computer program to recognize them, even though humans can usually identify the letters quite easily.


While recognizing letters is more complicated than most people realize, it is vastly simpler than identifying the thematic contents of an image.  Consider the picture on the left.  Try to imagine how difficult it is for a computer – challenged by recognizing just letters – to determine what is going on in this picture.  Are there human bodies or body parts in the picture?  What are they doing?  Is it pornographic?  Such questions are probably beyond any computer program.

Some confusion arises because Internet Filtering programs sometimes do make choices about whether to allow or prohibit access to image files.  In most cases, though, the filtering program is making this choice on the basis of text, not the image contents of the file.  For one thing, the program can look at the text surrounding a link on a web page and, assuming that the text gives some idea of the contents of the image to which the link leads, can prohibit access if that text contains tabooed terms.  The name of the file itself is also text that can be checked for tabooed terms.  In addition, the image file may contain text that is hidden from most viewers.  Depending on the format of the image file (gif, jpeg, png, etc.), there may be "tags" inside the file: text that describes the file contents.  These tags are not visible when the image is displayed, but they are present behind the scenes, and the filtering program can find them and check them for tabooed terms.
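As a concrete illustration, here is a minimal Python sketch of this kind of text-only screening of an image file.  It is not any real product's code: BLOCKED_TERMS and the function names are invented, and the Pillow imaging library is assumed only as a convenient way to read the kind of embedded text tags mentioned above.

# Illustrative only: gather every piece of plain text a filter can
# "see" for an image, then check it against a placeholder term list.
# The pixels themselves are never interpreted.

from PIL import Image  # Pillow, assumed installed (pip install Pillow)

BLOCKED_TERMS = {"tabooed-term-1", "tabooed-term-2"}  # placeholder list

def text_clues(image_path, surrounding_text):
    """Collect the plain-text clues available for an image file."""
    clues = [image_path.lower(), surrounding_text.lower()]
    with Image.open(image_path) as img:
        # For formats like PNG, embedded text chunks show up in img.info.
        for value in img.info.values():
            if isinstance(value, str):
                clues.append(value.lower())
    return clues

def allow_image(image_path, surrounding_text):
    """Allow the image unless a tabooed term appears in some text clue."""
    return not any(
        term in clue
        for clue in text_clues(image_path, surrounding_text)
        for term in BLOCKED_TERMS
    )

If the file name, the surrounding link text, and the embedded tags are all innocuous, allow_image returns True no matter what the picture actually shows.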

Beyond the technical issues lies a much more important and entirely human one: people can't agree on a definition of pornography.  A precise legal definition has evaded lawyers and judges for decades, and today the determination is left up to juries applying "community standards" to decide what is prurient or patently offensive, and a "reasonable person's" judgment as to whether a work has "serious value."  Such vagueness is something humans may be able to grapple with, but it is well beyond the reach of computational logic.

Progress is being made in the field of Artificial Intelligence, but we’re not there yet.  For the foreseeable future, computers will have to be told in great detail what to do and how to do it, and that makes it impossible for them to accomplish the nearly instantaneous pattern-recognition that is natural for the human brain. For now, Internet Filters will have to rely on Black Lists and White Lists determined by human review of websites, and on recognizing keywords that indicate possibly objectionable content. 
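For readers who want to see the shape of that reliance, here is a minimal sketch of the basic decision just described: White List first, then Black List, then keyword scanning.  The list names and example entries are invented placeholders; no real product works from lists this small.

# The basic text-based decision described in this series, in miniature.
# WHITE_LIST, BLACK_LIST, and KEYWORDS are invented placeholders.

WHITE_LIST = {"example-encyclopedia.org"}   # human-reviewed: always allow
BLACK_LIST = {"example-blocked-site.com"}   # human-reviewed: always block
KEYWORDS = {"tabooed-term-1", "tabooed-term-2"}  # fallback text patterns

def allow_page(host, page_text):
    """Decide whether to display a web page, using only plain text."""
    if host in WHITE_LIST:
        return True   # trusted site: no further checking
    if host in BLACK_LIST:
        return False  # a human reviewer has already judged this site
    lowered = page_text.lower()
    return not any(word in lowered for word in KEYWORDS)

Everything in the sketch operates on text: host names and page words.  Nothing in it could tell a medical diagram from an obscene photograph, which is exactly the limitation this article has described.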

Previous articles in this series:
            What are Internet Filters?

Future articles in this series:
Internet Filters and Email, Chats, and Attachments.
Internet Filters: The Constitutional Headache