Content Filtering Explained

If you’re in charge of the public computer lab at your school or library, chances are you’re familiar with the Children’s Internet Protection Act (CIPA), a U.S. law that aims to restrict children’s access to online content that has been deemed inappropriate or harmful to minors. Under the CIPA, schools and libraries that wish to receive federal funds to purchase technology equipment must implement protection measures to help shield minors from objectionable online material.

To comply with the CIPA, many libraries and schools are turning to content filters, hardware or software that restricts which sites and pages users can access. While schools and libraries should consult the full text of the CIPA (PDF) to learn more about what kind of content-filtering system will help them comply with this law, other organizations that operate public computer labs (such as community centers) have more flexibility and can make their own content-filtering rules.

Regardless of whether or not you must comply with the CIPA, having a solid grasp of the fundamentals of content filtering is the first step toward successfully introducing it to your computer lab and to the people you serve. To help you quickly understand how content filters work and gain insight into issues surrounding this technology, we’ve compiled the answers to a handful of frequently asked questions.

1. What are content filters and how do they work?

A content filter is a piece of hardware or software that acts as a shield between the Internet and a user’s computer, blocking access to potentially objectionable or offensive material. Most content filter manufacturers compile a list of sites they deem objectionable and classify them under different profiles, which often pertain to the end user’s age.

For instance, a content filter’s most aggressive blocking profile might be designed for children under 10 and would therefore restrict all access to a large range of materials, such as pornography; pages about illegal drugs; sites that deal with sex education; and sometimes even social-networking sites such as MySpace. On the other hand, profiles for adult users might allow most types of content forbidden to younger users yet still block the majority of sites that are known to install malware. If one of the filter’s built-in profiles is too restrictive or lax for your audience’s needs, you will often be able to create a custom profile or alter one of the presets to your liking.
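The profile-based blocking described above can be pictured as a simple lookup: each profile names the categories it blocks, and a vendor-supplied table maps sites to categories. The sketch below is purely illustrative — the profile names, categories, and domains are invented for this example and do not come from any real filtering product.

```python
# Illustrative sketch of age-based filtering profiles. All names,
# categories, and domains here are hypothetical examples.

PROFILES = {
    "child": {"pornography", "drugs", "sex-education", "social-networking"},
    "teen": {"pornography", "drugs"},
    "adult": {"malware"},  # adults still shielded from known malware sites
}

# A vendor might ship (and automatically update) a table like this,
# mapping known sites to content categories.
SITE_CATEGORIES = {
    "example-social.net": "social-networking",
    "example-badware.com": "malware",
}

def is_blocked(site: str, profile: str) -> bool:
    """Return True if the site's category is blocked under the given profile."""
    category = SITE_CATEGORIES.get(site)
    return category in PROFILES[profile]

print(is_blocked("example-social.net", "child"))  # True
print(is_blocked("example-social.net", "adult"))  # False
```

In a real product the category table would contain millions of entries and be refreshed by the manufacturer, but the decision logic — look up the site’s category, then check it against the active profile — follows this general shape.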

In addition, content filters generally let you block any Web pages or search results that contain single or multiple instances of user-specified keywords. Many content filters also allow you to blacklist (always block) specific sites by entering their URLs. Note that content-filter manufacturers often provide automatic updates to their product’s list of objectionable sites in order to account for sites that have recently appeared on the Internet.
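The keyword and blacklist features just described can be sketched as two checks: first consult an always-block URL list, then scan the page text for user-specified keywords. This is a minimal sketch assuming simple case-insensitive substring matching; real filters use far more sophisticated matching, and the domains and keywords below are made up for illustration.

```python
# Minimal sketch of URL-blacklist plus keyword filtering.
# Domains and keywords are hypothetical examples.

BLACKLIST = {"blocked-example.com"}      # sites the operator always blocks
BANNED_KEYWORDS = {"casino", "warez"}    # user-specified keywords

def should_block(url: str, page_text: str) -> bool:
    """Block if the host is blacklisted or the page contains a banned keyword."""
    host = url.split("//")[-1].split("/")[0]
    if host in BLACKLIST:
        return True
    text = page_text.lower()
    return any(keyword in text for keyword in BANNED_KEYWORDS)

print(should_block("http://blocked-example.com/page", ""))        # True
print(should_block("http://ok.example.org", "Play casino now!"))  # True
print(should_block("http://ok.example.org", "Library hours"))     # False
```

Naive substring matching also illustrates why over-blocking happens: a keyword rule that blocks “breast” would block cancer-research pages too, a problem discussed later in this article.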

2. What are the potential pros and cons of content filtering?

Content-filtering technologies have sparked much controversy and debate. While advocates claim that this technology protects minors from harmful material and online predators, opponents (who often refer to it as “censorware”) believe that content filtering is inherently error-prone and can restrict access to educational or other important information.

Libraries and schools may find that implementing a content-filtering solution on all of their computers can benefit them financially, as this will help them comply with the CIPA and qualify for federal funding for technology-related purposes. Other types of organizations that offer public-access computers to children or youth groups might also find content filters beneficial because they can reduce liability and help cut down on phone calls or visits from distressed parents.

Another potential benefit of installing a content-filtering system is that it can help decrease the amount of malware that your patrons inadvertently install on your machines. Many content filters keep lists of sites known to install malware and prevent users from accessing them; also, if your content filter has a blacklist feature, you can manually block sites that you know install malware. Note that a content filter should never be considered a substitute for dedicated anti-malware and antivirus programs.

Despite the aforementioned benefits, content filters are not without their downsides. Several predecessors to the CIPA were successfully struck down by the Supreme Court, partially on the grounds that they violated an individual’s right to free speech. The American Library Association (ALA) has also challenged the CIPA in court, though the Supreme Court eventually upheld the law as constitutional.

Other critics of content filtering — including the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU) — contend that the technology is also flawed because it often accidentally blocks useful materials that could be used for educational purposes. For instance, an aggressive content filter might block the term “breast,” which would prevent patrons from conducting legitimate research into topics such as cancer or anatomy. One organization that advocates for youth freedom of speech online has tested a number of popular content-filtering programs and compiled a list of legitimate sites that were accidentally blocked.

Content-filter opponents also object to the technology because sites deemed objectionable are subjectively chosen by hardware and software manufacturers and not by a central, impartial organization.
