While Internet Filters, in spite of the imperfections documented in this series of articles, can be effective at preventing minors from accessing sexually explicit materials, they can also create a constitutional headache. Specifically, they can create situations in which the Free Speech provisions of the First Amendment are violated. These problems can arise because of differences in 1) WHERE a filter is implemented and 2) WHAT a filter blocks.
The problem of WHERE a filter is implemented is relatively straightforward. Many filters are designed for use in a private setting, either a home or a corporation, in which Free Speech issues are minimal. A parent has a legal right to block the internet content of which he or she disapproves. In a corporate workplace Free Speech concerns are a little more significant, but the company still has broad authority to control its employees’ use of the internet, at least where the company owns the computers and internet connection being used. Many Internet Filtering programs, then, are designed for environments where private decisions can legally take precedence over Free Speech concerns.
The problem of WHAT filters block is a little more subtle, but still critically important. The problem, if you want to call it that, is that the manufacturers of filtering programs design their products to satisfy their customers, who are mostly parents and business managers, not librarians or lawyers. Filters tend, then, to block the kinds of things parents and managers want blocked, not just what the law says can be blocked. In general terms, what the law says can be blocked without creating First Amendment entanglements is:
Child pornography – images of real minors depicting them in a sexual or sexualized manner.
Obscenity – depictions or descriptions of bodies or bodily functions that are, from the standpoint of community standards, prurient and patently offensive, and are, by the standards of a reasonable person, without serious value. Within limits, the exact definition of obscenity can vary from state to state and locality to locality.
Harmful to Minors – parallels the definition of obscenity, but is evaluated by the standards that apply to minors. This is material that is too racy for children but not too racy for adults. For adults, material that is Harmful to Minors is protected speech, and interfering with an adult’s access to such materials may violate the First Amendment. It is generally legal, though, to block a minor’s access to such materials. Since the exact definition of obscenity varies from state to state and locality to locality, so does the category of Harmful to Minors, at least in theory. There is an exception, however: the Children’s Internet Protection Act (CIPA) contains a nationwide definition that applies to filtering requirements in all libraries that are bound by the Act (i.e., that accept the government funds that bind them to the Act).
Depending on the product used and the configuration settings chosen by the purchaser, Internet Filters often block a great deal more than these three categories of expressive materials. Frequently blocked materials include those that contain foul language, depict drug use, glorify criminality, or demean a particular religion or religion in general, as well as sex education materials and information on birth control or abortion. None of these materials falls within the categories of child pornography, obscenity, or harmful to minors.
Note that this problem of WHAT a filter blocks is not just a question of overblocking. The reader might recall that overblocking is essentially a programming failure, an inability of the software to make a blocking decision the same way a trained human being would. That is part of the problem of WHAT filters block, but not all of it. What’s been described above is overzealous blocking that also happens by deliberate design, when the program is working as intended.
The problems of WHERE filters are implemented and WHAT they block interact with each other. In a private setting it matters little what the filter blocks, as long as the purchaser of the filter is satisfied. In other words, the WHERE makes the WHAT irrelevant. In a public setting, however, the WHERE makes the WHAT absolutely critical. In a public library above all, legal precedent has established that Free Speech protections are quite strict. In a public library the purchaser of the filtering program is a government agency, which has a constitutionally limited authority to dictate to private citizens what they may read or look at. In a public setting, blocking needs to be carefully tailored to filter out only materials that are Child Pornography, Obscenity, and material that is Harmful to Minors (if accessed by a minor). Blocking access to more than this – for either adults or minors – may violate the First Amendment and may result in a Free Speech lawsuit against the library and its city or county.
Again, none of this is to claim that Internet Filters don’t work at all or shouldn’t be employed. The purpose of this series of articles has simply been to raise awareness of some of the real-world limitations of Internet Filters. Filters don’t control all the means of communication a computer user has, are sometimes designed to restrict access more broadly than the law requires or allows in the context of a public library, and are known to fail to meet their own design objectives some of the time. It is especially important for censorship proponents to learn that Internet Filters are NOT magic. Even with the best filtering products available, some library patrons are going to be able to gain access to pornographic materials, and minors may not be protected from all the means of communication an internet predator might use. Filters could possibly be made more restrictive, but this will accomplish nothing if a court orders the filters deactivated because they infringed excessively on someone's Free Speech rights. The keys to accomplishing a modicum of protection with Internet Filters are balance and realistic expectations.
Previous articles in this series: