Sunday, January 3, 2010

Internet Filters Underblock and Overblock.

(Second in a series of five articles on Internet Filters)

Internet Filters can be a highly effective means of protecting children from pornographic materials they might find on the internet.  However, filters are not perfect; they suffer from several kinds of problems that reduce their effectiveness and can lead to legal entanglements.  One of the better documented of these problems is that Internet Filtering programs both underblock and overblock access to websites.
Underblocking occurs when a filter incorrectly allows a computer user to display text or images that should be blocked.  Allowing a child to access a pornographic picture would be an example of underblocking.
Overblocking occurs when a filter incorrectly prevents a user from displaying legitimate text or images.  Preventing an adult from accessing information on birth control methods would be an example of overblocking.
One can measure the rate of underblocking and overblocking experimentally, and the rates of one filtering program can be compared with the rates of other filtering programs.  One way to do this would be for human beings to pick a finite set of existing web pages and to review these manually, identifying which should and should not be blocked.  A computer with a particular filtering program can then be used to attempt access to each of those identified websites, keeping count of the successes and failures of the filtering program compared to the manual decisions.  If the filter allows access to 6 out of 100 pages that the humans said should be blocked, that would be an underblocking rate of 6%.  If the filter denies access to 9 out of 100 pages that humans said should be allowed, the overblocking rate would be 9%.  The two rates operate independently, so they won't add up to 100%, and a change in one number doesn't automatically change the other.
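As a rough sketch of the bookkeeping such an experiment involves, the snippet below works through the same arithmetic in Python.  The pages, the reviewers' decisions, and the filter's behavior are all made-up examples chosen for illustration, not data from any actual study.

    # A minimal sketch of the measurement described above.  The pages, the manual
    # decisions, and the filter's behavior are hypothetical examples, not real data.

    # Human reviewers decide which pages should be blocked (True) or allowed (False).
    manual_review = {
        "http://example.com/explicit-page":        True,
        "http://example.com/another-explicit-one": True,
        "http://example.com/birth-control-info":   False,
        "http://example.com/news-article":         False,
    }

    # What the filtering program under test actually did (True = blocked).
    filter_blocked = {
        "http://example.com/explicit-page":        True,   # correct block
        "http://example.com/another-explicit-one": False,  # underblock
        "http://example.com/birth-control-info":   True,   # overblock
        "http://example.com/news-article":         False,  # correct allow
    }

    should_block = [u for u, block in manual_review.items() if block]
    should_allow = [u for u, block in manual_review.items() if not block]

    # Underblocking: pages the reviewers said to block, but the filter let through.
    underblocked = sum(1 for u in should_block if not filter_blocked[u])
    # Overblocking: pages the reviewers said to allow, but the filter blocked.
    overblocked = sum(1 for u in should_allow if filter_blocked[u])

    print("Underblocking rate: %.1f%%" % (100.0 * underblocked / len(should_block)))
    print("Overblocking rate:  %.1f%%" % (100.0 * overblocked / len(should_allow)))

With 100 pages in each category, 6 incorrectly allowed and 9 incorrectly blocked would yield the 6% and 9% figures above.  Notice that the two rates are computed against different denominators, which is why they move independently of each other.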

Although overblocking and underblocking operate independently, and all filtering programs have some of both, there is a general tendency for filtering programs to do better by one measure and worse by the other.  This results from design considerations that tend to make one filtering program more restrictive or more permissive compared to another filtering program.  A filtering program designed for parental controls on a home computer will usually be more restrictive, underblocking less and overblocking more, in large part because the user whose access will be filtered is presumed to be a minor.  A filtering program designed for corporate use, in an environment in which most computer users are adults, will tend to be more permissive, underblocking more and overblocking less.

Such experiments have been done, and the results of some of those studies were entered into evidence in a court case known as ACLU v. Gonzales, decided in 2007 in the US District Court for the Eastern District of Pennsylvania.  Censorship proponents are fond of citing this case as indicating that filters are "95% effective."  The 95% figure is probably accurate in an abstract sense, but glosses over some important details (see especially page 37 of the decision).

Reading the court decision with some attentiveness, it is clear that the 95% figure is an inverted measure of the underblocking rate.  In other words, the evidence given to the court was that filters prevented access to sexually explicit webpages 95% of the time, which means that the filters incorrectly let the sexually explicit images through -- they underblocked -- 5% of the time. Some individual filtering products were better than this, while some were less accurate, but most were close to this figure.

The overblocking figures cited in the court decision were quite different.  Overblocking varied greatly, from 2.9% to 22% to as high as 32.8%.  In other words, filtering products varied widely in how often they incorrectly blocked access to legitimate websites.

While rating an Internet Filtering program as "95% effective" may sound great, it is important to remember that such a number is only useful for comparing one product with another, and has no meaning in any absolute sense.  There is no "good" number for the overblocking or underblocking rates, other than zero.  Given that the internet is made up of billions of pages, always increasing and always changing, an error of only 1% can mean millions of pages to which access is incorrectly permitted or incorrectly denied.
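To make the scale point concrete, the short calculation below applies a 1% error rate to a hypothetical web of a few billion pages; the page count is an assumed figure chosen only to illustrate the order of magnitude, not a measurement of the actual size of the web.

    # Illustrative arithmetic only: the total page count is an assumption,
    # not a measurement of how large the web actually is.
    total_pages = 4_000_000_000   # hypothetical: 4 billion pages
    error_rate = 0.01             # a seemingly small 1% error rate

    misclassified = int(total_pages * error_rate)
    print("Pages incorrectly blocked or allowed: {:,}".format(misclassified))
    # -> Pages incorrectly blocked or allowed: 40,000,000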

In a private setting, the consequences of filtering errors are minimal.  A parent who installs an Internet Filter on a home computer is unlikely ever to notice, let alone complain about, overblocking of the children's internet access.  A parent who believes the installed filter is underblocking, letting the children see inappropriate materials, has little recourse other than to return the product for a refund.

The stakes are higher in a public setting, above all in a public library.  The library must struggle to meet the needs of both adult and minor patrons, while scrupulously protecting the Free Speech rights of both.  A parent may become irate if a child gains access to pornographic images through a filtered library computer that underblocks.  On the other hand, a filter that overblocks, preventing adult access to legal websites, can easily land a library (and thereby the city or county of which it is a part) in a costly lawsuit.

There should be no doubt in anyone's mind that overblocking can violate the First Amendment.  The government's authority to prevent adult access to protected speech, especially inside a public library, is extremely limited. Internet filters, by their nature, discriminate on the basis of expressive content. At one point in time on one computer, one webpage will be accessible while another will not, so it is quite difficult to argue that the filter is just a restriction in time, place or manner of receiving protected information.  Overblocking, then, begs for strict scrutiny, the most stringent analysis a court can give to potential infringements on Free Speech.

For an adult patron, the simplest way for a library to fix an overblocking problem when one crops up is to disable the filter completely for that one patron on that one computer at that one time.  Both CIPA itself and the US v. ALA decision that upheld it allow this option.  Whether this can be done in a timely manner depends on the design of the particular filtering program.  Some products might allow a registered adult patron to disable the filter without staff intervention, while other products might require a library employee with sufficient privileges to enter commands on a central computer.  Regardless of what is involved, time is of the essence.  A library that reacts too slowly to a complaint of overblocking is likely to wind up in court.

In a situation in which a minor is being blocked from a legitimate site, the procedures will necessarily be more complicated.  Disabling the filter on the minor's computer could allow that minor to access sexually explicit materials (intentionally or accidentally), and in a library bound by the Children's Internet Protection Act, might violate that act.  Adding the blocked site to the White List (of always-permitted sites) might be feasible, but only if the requested site is appropriate for minors of all ages, and not just for the patron complaining of overblocking.  The one option the library does not have is to ignore the problem.  Minors do have Free Speech rights, and blocking access to sites a minor should be allowed to see can result in a lawsuit just as surely as blocking an adult's access.

To sum up, Internet Filters can be an effective, but not perfect, means of protecting younger children from inappropriate materials on the internet.  In public settings, especially in a public library, overblocking can create Free Speech entanglements that result in costly lawsuits.  For adult patrons, it is often most practical simply to disable the Internet Filter on demand.  For older minors, however, overblocking may become a time-consuming administrative headache, as well as a lawsuit risk.

Previous article in this series: 
Future articles in this series: 
Can Internet Filters Identify Obscene Images? 
Internet Filters and Email, Chats, and Attachments.
Internet Filters: The Constitutional Headache
