Safeguard Children Online with WebTitan
Learn About CIPA Compliance and Web Filtering Solutions Now!
The Children’s Internet Protection Act (CIPA) has been in existence for roughly fourteen years now, but the dispute over web filtering in libraries and schools is still debated by advocates on both sides. Giving all children access to the Internet was a rallying cry throughout the 1990s. President Bill Clinton and Vice President Al Gore were strong supporters of getting schools wired to the Internet. They and others had a vision of a new world of mouse clicking and web surfing that would put unlimited informational resources in the hands of all children. The technology was to be the great equalizer, providing equal access to global information for everyone regardless of social class or economic circumstance.
The Internet, however, is commonly referred to as “The Wild West.” Since the mid-1990s, it has been synonymous with pornography and other dangers that are plainly harmful to children. The dilemma of how to protect children on the public Internet brought about CIPA. To ensure CIPA compliance, however, you need a web filter. Web filters must of course be managed by someone, and that is where resistance to CIPA becomes magnified. Below we have outlined the pros and cons of CIPA and the arguments for and against it.
For the last several years, the American Association of School Librarians (AASL) has declared one day during Banned Books Week as Banned Websites Awareness Day. Banned Books Week is the annual celebration of the freedom to read and an observance of the First Amendment and our inherent right to read. The inclusion of Banned Websites Awareness Day is an attempt to spotlight what the AASL sees as hypocrisy: Americans are outraged at the idea of banning books, but take no offense at the banning of websites. No one can refute that web filtering creep takes place, blocking more than just pornographic images and offensive material.
Protect your users and data today – book a free trial of the WebTitan web security solution.
One of the cons of CIPA and its web filter requirement is that administrators still treat the Internet as it was in the 1990s: a digital document repository. The fact is, however, that many school districts block social networking sites. These social networks are a natural part of how the modern workforce communicates, and the ability to master social media and networking skills can be a valuable asset in furthering one’s career. Many schools also block YouTube and other video streaming sites. While these sites certainly contain questionable material, they also contain informative, high-value videos.
Some organizations, such as the ACLU, state that web filtering is sometimes an act of discrimination. They point out that many schools filter out LGBT sites, for instance. The ACLU recently launched a “Don’t Filter Me” campaign to spotlight this over-filtering practice. Students and teachers have complained of being unable to access certain online magazines, such as National Geographic, which is sometimes blocked for nudity, or websites about countries such as Iran, China, and Russia, which are often blocked due to those countries’ hacking reputations.
The most interesting concern over CIPA is what some refer to as digital redlining. The purpose of E-Rate was to eliminate the digital divide between areas of economic disparity. E-Rate has in fact put computers in the hands of community citizens in rural and urban areas, giving them access to the Internet and its vast learning reserves. Some argue, though, that CIPA has created a new digital divide when it comes to Internet access. The inequality exists because children in wealthier areas who are heavily filtered at school can go home and access sites on their own family devices that are inaccessible at school. Even kids who receive school-issued laptops that go home with them can still escape the filter at night on family devices. This is not the case for kids in many urban and rural areas: they lack the means to escape the filter, locking them out of popular sites such as social media and video streaming at all times. The end result is two classes of students, each with a different level of access.
The library associations are among the biggest critics of CIPA. They like to point out that they do not ban books from their patrons. True, they do not ban books; they just do not select them for availability. Few libraries, if any, carry pornographic magazines or picture books. They do not carry books on how to carry out terroristic threats, make or distribute illegal drugs, or promote racism. There are even works of classic literature not found in many libraries. Is their choice not to carry these books an act of banning them? Of course not.
This is why the library cannot be compared to the Internet; it is apples and oranges. Patrons cannot read books that are not on the library’s shelves. On the Internet, everything is there to be viewed by everyone connected, including child pornography. To argue against the filtering of child pornography is preposterous.
CIPA neither encourages nor discourages school districts from filtering the Internet beyond the minimum requirements. If a school system chooses to do so, it is a local decision. Schools must serve their students, parents, and community taxpayers, and religious schools or schools located in culturally conservative areas have to consider their needs and expectations. What’s more, web filtering is a necessary security tool for enterprises today. Malware-distribution sites and typosquatting sites are two categories of malicious sites that must be blocked for the security of users. Filtering is not always about blocking content; often it is about keeping users and their devices safe from malicious threats such as worms. Again, it is a local decision, but often a necessary one.
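To illustrate the typosquatting category mentioned above, here is a minimal sketch of how a filter might flag a lookalike domain by measuring how close a requested hostname is to a well-known brand. The brand list, threshold, and function names are illustrative assumptions, not WebTitan’s actual method.

```python
# Hypothetical typosquat check: a hostname that is *nearly* identical to a
# well-known domain (e.g. "paypa1.com" vs "paypal.com") is suspicious.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming (row-by-row)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Illustrative list of protected brands; a real product would ship thousands.
KNOWN_BRANDS = ["google.com", "paypal.com", "microsoft.com"]

def is_typosquat(host: str, max_dist: int = 2) -> bool:
    """Suspicious if the host is close to, but not exactly, a known brand."""
    return any(0 < edit_distance(host, brand) <= max_dist
               for brand in KNOWN_BRANDS)
```

An exact match to a known brand is allowed (distance 0), while a one- or two-character variation is flagged; the `max_dist` cutoff is a tunable policy choice.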
The issue of filtering students at home is becoming a moot issue due to the proliferation of cell phones. The cell phone is often the preferred device for certain types of sites, such as social media and messaging services. The combination of a school-provided laptop for educational research and a personal smartphone for social use gives students appropriate access.
The issue of filtering, in any capacity, will always be debated in a free society, as it should be. Regardless of the various pros and cons of CIPA and web filtering, CIPA is a federal requirement governing the procurement of E-Rate funds, and there is no incentive in Washington to alter or terminate CIPA as it stands today. CIPA is here to stay.
Start protecting your data, your students and your employees today. Get in touch to learn more about WebTitan content filtering (an E-Rate eligible solution).
Web filters work either by blocking web content based on the reputation of the website itself, or by evaluating web page content in real time. The first method requires the web filter to consult a list of harmful pages and match their signatures against the content users wish to access. The second method uses a more sophisticated approach, categorizing content as it is rendered.
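The two methods above can be sketched in a few lines. This is a deliberately simplified illustration, assuming a made-up blocklist, keyword-based categories, and a local policy set; commercial filters use far larger reputation databases and machine-learned classifiers.

```python
# Method 1: reputation lookup against a list of known harmful sites.
# Method 2: real-time categorization of the page text itself.
# All hostnames, keywords, and policy choices below are illustrative.

BLOCKLIST = {"badsite.example", "malware.example"}  # known harmful hosts

CATEGORY_KEYWORDS = {
    "gambling": {"casino", "poker", "betting"},
    "adult": {"xxx", "porn"},
}
BLOCKED_CATEGORIES = {"adult", "gambling"}  # a local policy decision

def blocked_by_list(host: str) -> bool:
    """Method 1: match the site against the harmful-site list."""
    return host in BLOCKLIST

def categorize(page_text: str) -> set:
    """Method 2: assign categories from the words actually on the page."""
    words = set(page_text.lower().split())
    return {cat for cat, kws in CATEGORY_KEYWORDS.items() if words & kws}

def should_block(host: str, page_text: str) -> bool:
    """Block if either method trips: bad reputation or a blocked category."""
    return blocked_by_list(host) or bool(categorize(page_text) & BLOCKED_CATEGORIES)
```

The reputation check is cheap but only catches sites already on the list; real-time categorization also catches new pages the list has never seen, which is why modern filters combine both.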
Indirectly. The laws that regulate video game content were created before mobile games were widely available on children’s phones. However, the Children’s Online Privacy Protection Act (COPPA) does specifically forbid app developers (including video game publishers) from collecting and using sensitive information from minors without parental consent.
New laws are arguably necessary, but many past attempts to block minors from accessing harmful content, including the Child Online Protection Act of 1998 and the proposed Social Media Child Protection Act of 2023, have run into problems. Some of these proposed rules violate First Amendment rights. Others place unreasonable expectations on social media companies, such as knowing whether an anonymous user is a minor.
Schools must implement filters that block access to content that is obscene, pornographic, or harmful to minors. The regulation also requires schools to monitor the online activities of users – which includes both students and adult users alike. Schools must adhere to these regulations and prove that their internet policies are compliant.
One of the major weaknesses of state-led internet protection regulation is the fact that children can generally access harmful content on their own at home. Nothing prevents one student from sharing explicit imagery with the rest of their class. Internet protection rules only work when schools and parents work together to limit children’s access to harmful materials.
Children may navigate to harmful websites on devices outside the school’s control. They may also share inappropriate content with one another. Some children are capable of using technical exploits to bypass school filters and access harmful content directly. Children who own their own devices may also resist any attempt to limit their exposure to the internet.
Filters can block access to websites known for distributing harmful or pornographic content. They can’t prevent minors from engaging in inappropriate behaviors on other websites. Web filters can’t protect children from hackers and predators who abuse websites designed for children, or from hate speech delivered in online gaming environments.
The Children’s Internet Protection Act does not specifically name particular websites. That means that individual schools and organizations are responsible for creating lists of inappropriate content on their own. It also means that some questionable forms of content (like media depicting self-harm) are not covered.
The main arguments for internet filtering revolve around the obvious dangers of hackers and predators abusing web technologies to target children. The arguments against internet filtering involve limitations on free speech, the technical difficulty of actually preventing children from accessing harmful content, and the challenge of consistently defining the difference between “harmful” and “non-harmful” content.