Content-control software

From Wikipedia, the free encyclopedia

[Image: DansGuardian blocking whitehouse.com, a risqué political site.]

Content-control software, also known as censorware or web filtering software, is software designed and optimized to control what content a reader is permitted to access, especially material delivered over the Web. Content-control software determines what content will be available on a particular machine or network; the motive is often to prevent persons from viewing content which the computer's owner(s) or other authorities consider objectionable. Common use cases, at least in the United States, include parents who wish to limit what sites their children may view from home computers, schools performing the same function for computers found at school, and employers restricting what content may be viewed by employees while on the job.

Frequent subjects of content-control software are web sites that, according to the company providing the control, are alleged to contain material that the controlling authority considers objectionable.

Content-control software can also be used to block Internet access entirely.


Terminology

This article uses the term "content control," a term also used on occasion by CNN [1], Playboy Magazine [2], the San Francisco Chronicle [3], and the New York Times [4]. Two other terms, "censorware" and "web filtering," are also in common use, though they are more controversial.

Companies that make products that selectively block web sites do not refer to these products as censorware, preferring terms such as "Internet filter"; in the specialized case of software explicitly designed for parents to monitor and restrict the access of their children, the term "parental control software" is also used.

Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example. The use of the term "censorware" in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a 9 March 2006 editorial in the New York Times when discussing the use of American-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district [5].

Seth Finkelstein, an anti-censorware advocate, described what he saw as a terminology battle in a 2003 hearing at the Library of Congress:

I think the best public relations that the censorware companies ever did was to get the word "filter" attached to their products. When you think of a spam filter, for example, you think of something that you do not want to see.
But, again, as I said earlier, censorware is not like a spam filter. What censorware is, is an authority wants to prevent a subject under their control from viewing material that the authority has forbidden to them. This description is general. [6]

In general, outside of editorial pages as described above, traditional newspapers do not use the term "censorware" in their reporting, preferring instead to use terms such as "content filter," "content control," or "web filtering"; the New York Times and the Wall Street Journal both appear to follow this practice. On the other hand, web-based newspapers such as CNET use the term in both editorial and journalistic contexts, e.g., "Windows Live to Get Censorware."[7]

Issues

Filters can be implemented in many different ways: by a software program on a personal computer or by servers providing Internet access. Choosing an Internet service provider (ISP) that blocks objectionable material before it enters the home can help parents who worry about their children viewing objectionable content.
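To make the server-side placement concrete, the following minimal sketch (in Python) implements a filter as a tiny HTTP forward proxy, the same position an ISP or school gateway would occupy. The blocklist entry, port, and host names are invented for illustration; a real deployment would also need error handling and support for HTTPS (CONNECT) traffic, which this sketch omits.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse
    import urllib.request

    # A hypothetical blocklist; real products ship lists with many
    # thousands of entries, often in encrypted form.
    BLOCKED_HOSTS = {"objectionable.example"}

    class FilteringProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # A forward proxy receives the absolute URL in the request line,
            # so the target host can be read directly from self.path.
            host = urlparse(self.path).hostname or ""
            if host in BLOCKED_HOSTS:
                self.send_response(403)
                self.end_headers()
                self.wfile.write(b"Blocked by content filter")
                return
            # Otherwise fetch the page on the client's behalf and relay it.
            with urllib.request.urlopen(self.path) as upstream:
                self.send_response(upstream.status)
                self.send_header("Content-Type",
                                 upstream.headers.get("Content-Type", "text/html"))
                self.end_headers()
                self.wfile.write(upstream.read())

    if __name__ == "__main__":
        # Browsers configured to use this proxy will have blocked hosts refused.
        HTTPServer(("127.0.0.1", 8080), FilteringProxy).serve_forever()

A client-side filter makes the same decision, but runs on the user's own machine rather than on the network path, which is one reason it can sometimes be disabled by killing its process, as discussed under Bypassing filters below.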

Those who believe content-control software is useful may still not agree with certain ways it is used, or with mandatory general regulation of information. For example, many would disapprove of filtering viewpoints on moral or political issues, believing that such filtering could easily become a vehicle for propaganda. Many would also find it unacceptable for an ISP, whether by law or by its own choice, to deploy such software without allowing users to disable the filtering for their own connections. In addition, some argue that using content-control software may violate Articles 13 and 17 of the Convention on the Rights of the Child. [citation needed]

History

As the World Wide Web rose to prominence, parents, prompted by a series of stories in the mass media, began to worry that allowing their children to use the Web might expose them to indecent material. The US Congress responded by passing the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment, and the Supreme Court ruled in their favor. Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation, was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.

Critics then argued that while content-filtering software might make government censorship less likely, it would do so only by allowing private companies to censor as they pleased. They further argued that government encouragement of content filtering, or legal requirements for content-labeling software, would be equivalent to censorship. Although prohibited from doing so by various copyright laws, groups such as the Censorware Project began reverse-engineering content-control software and decrypting the blacklists to determine what kinds of sites the software blocked. They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets.

An example of this tendency was the filtering of all sites containing the word "breast," on the assumption that the word could only appear in a sexual context; this approach blocked sites that discuss breast cancer, women's clothing, and even chicken recipes. Similarly, over-zealous attempts to block the word "sex" would block words such as "Essex" and "Sussex." Content-control software has been cited [8] as one of the reasons Beaver College changed its name to Arcadia University, since content-control software had been blocking access to the college's web site.
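The over-blocking described above follows directly from naive substring matching. The toy sketch below, using an invented two-word blocklist, shows how matching "sex" as a bare substring flags "Essex," while even whole-word matching still blocks a breast-cancer page; neither variant can recover the context a human reader would.

    import re

    # An invented blocklist for illustration only.
    BANNED_SUBSTRINGS = ["sex", "breast"]

    def naive_filter(text):
        """Block if any banned string appears anywhere, even inside a word."""
        lowered = text.lower()
        return any(word in lowered for word in BANNED_SUBSTRINGS)

    def word_boundary_filter(text):
        """Block only on whole-word matches; this spares 'Essex' and
        'Sussex' but still over-blocks pages about breast cancer."""
        return any(re.search(rf"\b{re.escape(w)}\b", text, re.IGNORECASE)
                   for w in BANNED_SUBSTRINGS)

    for page in ["Visit Essex and Sussex", "Breast cancer screening info"]:
        print(page, "->", naive_filter(page), word_boundary_filter(page))
    # "Visit Essex and Sussex" is blocked under naive matching but passes
    # with word boundaries; the breast-cancer page is blocked either way.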

Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed and that therefore their claims were not valid.

Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include the blocking of several religious sites [9] [10] (including the web site of the Vatican), many political sites, and sites about gay and lesbian issues [11]. X-Stop was shown to block sites such as the Quaker web site, the National Journal of Sexual Orientation Law, the Heritage Foundation, and parts of The Ethical Spectacle [12]. CYBERsitter blocks sites such as that of the National Organization for Women [13].

The site Peacefire.org posted information about pages that such products blocked, and was then itself added to the blocklist. Solid Oak Software has vowed that Peacefire's reports about CYBERsitter "will be blocked wherever they may be." [citation needed]

Use in public libraries

Content-control software such as SonicWALL is used in many public libraries in the United States to block content classed as objectionable because of pornography or advocacy of violence. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian.

Many legal scholars believe that a number of legal cases [14], in particular Reno v. American Civil Liberties Union [15], established that the use of content-control software in libraries is a violation of the First Amendment. The Children's Internet Protection Act (CIPA) nonetheless required filtering in libraries receiving certain federal funding, and in the June 2003 case United States v. American Library Association the Supreme Court found CIPA not unconstitutional on its face, while leaving open a future "as-applied" constitutional challenge. However, the American Library Association maintains its stance that "ALA policy is unchanged: ALA does not recommend the use in libraries of filtering technology that blocks constitutionally protected information." [16] In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State for its policy of refusing to disable restrictions upon requests of adult patrons.

Australia's Internet safety advisory body, NetAlert Limited, provides "practical advice on Internet safety, parental control and filters for the protection of children, students and families," guidance that also covers public libraries.

In Denmark, the government has announced that it will "prevent inappropriate Internet sites from being accessed from children's libraries across Denmark."[17] "'It is important that every library in the country has the opportunity to protect children against pornographic material when they are using library computers. It is a main priority for me as Culture Minister to make sure children can surf the net safely at libraries,' states Brian Mikkelsen in a press-release of the Danish Ministry of Culture."[18]

Bypassing filters

Some software may be bypassed by using alternative protocols such as FTP or telnet, conducting searches in a different language, or using a proxy server. Some of the more poorly designed filters can be shut down by killing their processes: for example, through the Windows Task Manager in Microsoft Windows, or with Activity Monitor in Mac OS X. Numerous workarounds exist, as do counters to workarounds from content-control software creators.

Many content filters have an option which allows authorized people to bypass the content filter. This is especially useful in environments where the computer is being supervised and the content filter is aggressively blocking web sites which need to be accessed.
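As a sketch of how such an override option might work, the fragment below keeps a hypothetical in-memory table of per-user overrides that a supervisor can grant for a limited time. The function names, the fifteen-minute window, and the host names are all assumptions for illustration, not drawn from any real product.

    import time

    OVERRIDE_TTL = 15 * 60   # an assumed fifteen-minute override window
    _overrides = {}          # user name -> expiry timestamp

    def grant_override(user):
        """Record an override; called after a supervisor approves the request."""
        _overrides[user] = time.time() + OVERRIDE_TTL

    def is_blocked(user, host, blocked_hosts):
        """Return True if the host should be blocked for this user."""
        # An unexpired override lets the user through regardless of the list.
        if _overrides.get(user, 0) > time.time():
            return False
        return host in blocked_hosts

    # Usage: a supervisor approves a request, and the patron's traffic
    # then passes until the override expires.
    grant_override("patron42")
    print(is_blocked("patron42", "objectionable.example",
                     {"objectionable.example"}))   # False while override lasts

Time-limiting the override keeps the filter's default behavior intact for later users of the same machine, which matters in shared settings such as library terminals.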

Content-control software products

As described above, many content-control software products, and the concept of content control in general, can be controversial, especially in government-funded services or services that are not age-restricted. Many ISPs offer parental control options, among them Earthlink, Yahoo!, and AOL, and more general software such as Norton Internet Security includes "parental controls." Mac OS X v10.4 and later offers parental controls for several applications (Mail, Finder, iChat, Safari and Dictionary). The upcoming Windows Vista operating system also includes content-control software. See the Censorware category for a number of articles on content-control software products.
