Any time the government attempts to filter the public’s access to information, there is bound to be controversy. Such is the case with the current debate over public Internet access and the filtering software installed on many public computers. On one hand, you obviously do not want a pervert sitting in a public library salivating (or worse) over pornographic images while children sit nearby reading about George Washington. At the same time, the very notion of information filtering by the government, regardless of what kind of information is being filtered, rubs many people the wrong way.
One of the issues with filtering software is that, by the very nature of its design, it often deems appropriate material unacceptable. It could interfere with research on a topic such as breast cancer simply because the word “breast” appears on a website. By making such material inaccessible, the software crosses the line from blocking obscene material to impeding the legitimate use of the computer. Another issue with filtering software is determining what qualifies as obscene material. While most of us would agree that sexually explicit photographs rise to the threshold of unacceptable content, what about the website of the Communist Party of the United States? What about neo-Nazi organizations, pro-life organizations, or even pro-choice organizations? One person’s idea of obscene material might be another person’s perfectly acceptable political cause, religious belief, or research project. In this sense, a “slippery slope” argument could be made.
Those in power, if given the ability to selectively block access to objectionable information without very specific guidelines, would likely abuse their authority to advance their own agendas. A knowledgeable computer user can also, quite often, disable filtering software. While this might not be much of a problem with elementary-aged children, if teenagers and adults (probably the most likely to seek out inappropriate material) can get around the software, then what good does it actually do? Still, since these computers are purchased with public funds and are therefore public property, some sort of monitoring or control does seem appropriate. A committee appointed by an elected body at the local municipal level, one that ultimately answers to the voters of the community, could perhaps provide the best method of laying out exact guidelines as to what is and is not censorable material. Thus, in theory, the definitions of obscene and objectionable material would reflect the local community’s standards and values by making those in power accountable for the way in which they exercise their control.
Another less complicated – and less formalized – solution might be simply to place all of the computers in highly visible areas and allow librarians to exercise common sense and personal judgment in ejecting anyone who attempts to access pornographic or disruptive content.