Sour Sites

This week continues the theme of keeping kids safe online. In the previous post, we covered smartphone apps that can be trouble. Parents have some level of control over those apps by regularly inspecting their student's phone or tablet. What about when your child is at school? What measures is your school district taking to keep your child safe from unwanted content online?

Are schools required to filter the Internet?
Yes, IF they receive federal E-rate funding. E-rate requires that the school district comply with CIPA, the Children's Internet Protection Act of 2000. CIPA states that schools must use "a technology protection measure with respect to any of its computers with Internet access that protects against access through such computers to visual depictions that are obscene, child pornography, or harmful to minors." Most public schools choose to apply for E-rate funding and therefore must implement a content filter on the Internet access they provide to students.

How does this filtering work?
It is essential for anyone reading this to understand why and how websites are blocked. Most schools have a content filtering appliance in place that controls what is and is not blocked. Sites are categorized based on content (news, entertainment, sports, gaming, etc.). This device and its categorization algorithms account for about 99% of all the filtering that occurs in the school.
CIPA requires that specific content, such as pornography, be filtered, but it is up to the district to choose which additional categories to allow and which to block. There is an ongoing debate among educators and IT professionals about what those categories and sites should be. Some believe filtering is too restrictive and doesn't provide students with an authentic learning experience. Others believe in a more locked-down approach to help keep students safe and focused on learning. In my district, we take a more progressive approach, allowing students access to sites such as YouTube and Twitter.
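To make the category-based approach concrete, here is a minimal sketch of the lookup-and-block decision a filtering appliance makes. All the names and the tiny category table are illustrative assumptions, not any real product's database; actual appliances use vendor databases with millions of entries and machine-learned classifiers.

```python
# Hypothetical category database: domain -> category.
# Real filters maintain this at massive scale via vendor feeds.
CATEGORY_DB = {
    "example-news.com": "news",
    "example-games.com": "gaming",
    "example-video.com": "streaming",
}

# Categories the district has chosen to block (CIPA-required
# categories plus the district's own additions).
BLOCKED_CATEGORIES = {"pornography", "gaming"}

def is_blocked(domain: str) -> bool:
    """Return True if the domain's category is on the block list.

    Note that unknown domains fall through as allowed here, which
    is exactly the gap discussed below: brand-new sites sit
    uncategorized for weeks or months.
    """
    category = CATEGORY_DB.get(domain, "uncategorized")
    return category in BLOCKED_CATEGORIES

print(is_blocked("example-games.com"))   # True: blocked by category
print(is_blocked("brand-new-site.com"))  # False: uncategorized, slips through
```

The sketch also shows why the debate over filtering levels comes down to one set: changing what a district blocks is largely a matter of which categories go into that block list.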

It is not perfect…
Content filtering devices use predetermined categories to filter out material that is inappropriate for schools. One issue with this is the sheer volume of new websites that appear every day. This post from 2013 estimates that 571 new websites are launched every single minute. It can take weeks or even months for the major content filtering providers to categorize these new sites, so at any given time there are millions of unclassified sites. While not every one of those sites is inappropriate, there are enough out there to confidently say a student could find one.
The other problem with filtering is that sites can be miscategorized, or unwanted material can be uploaded to websites that are otherwise appropriate for educational use. A fantastic example of this is Google Sites. Google Sites is not a gaming website, but it hosts hundreds of online games. It is a tool Google created to host millions of websites on just about every topic imaginable. If the district were to block Google Sites entirely, it would also block many perfectly acceptable sites, including student-created projects and portfolios.
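The shared-hosting problem above can be sketched in code: blocking a whole domain takes out every site hosted on it, while blocking at the path level targets only the problem site. The URLs below are hypothetical examples on a shared hosting domain like Google Sites, not real sites.

```python
from urllib.parse import urlparse

# Hypothetical: one problem site identified by its path on the
# shared hosting domain.
BLOCKED_PATHS = {("sites.google.com", "/view/hypothetical-games-hub")}

# What a blanket block would use instead; left empty here so the
# path-level rule does the work.
BLOCKED_DOMAINS = set()

def allowed(url: str) -> bool:
    """Allow a URL unless its domain, or its (domain, path) pair,
    is on a block list."""
    parsed = urlparse(url)
    if parsed.netloc in BLOCKED_DOMAINS:
        return False
    return (parsed.netloc, parsed.path) not in BLOCKED_PATHS

# Path-level blocking stops the games site but keeps student work:
print(allowed("https://sites.google.com/view/hypothetical-games-hub"))  # False
print(allowed("https://sites.google.com/view/student-portfolio"))       # True
```

Adding "sites.google.com" to BLOCKED_DOMAINS would flip both results to False, which is the over-blocking trade-off districts weigh.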

Content filtering is just one of the ways schools help keep students safe online. Safe search, machine learning, and a digital citizenship curriculum all fall into this category as well.

What does all this mean to parents?
School districts use a variety of methods to help keep students safe online, but filtering out every single undesirable website on the Internet is almost impossible. Each district decides how open or restrictive its content filtering is. You should feel confident that there are protections in place; however, no filter is 100% perfect. If you have a question specific to your district, send an email to the director of technology requesting more information. I've created a sample email you can use to get the conversation going.

“Good Morning,
I am interested in learning more about content filtering in our district. How does the district decide what sites to allow and which ones to block? I do not have any concerns at this time, but if I do in the future, who is the best person to direct those questions to?”

What does all this mean to educators?
Content filtering should be a collaborative effort. Educators and district leaders need to work together to develop a level of screening that works well for learners of all ages. This doesn't mean that a committee needs to review every unblock request; the technology leadership should be able to handle day-to-day requests based on their judgment. I'm advocating for a partnership to determine the initial filtering levels: which categories to allow and which to block. Adjustments can be made as the school year goes on based on feedback from everyone involved, including students. This also underscores the need for a strong digital citizenship curriculum and implementation plan.

If your child does access some unwanted content, the best thing to do is have a conversation with them about it before making any assumptions. This week's assignment is to talk with your student(s) about what they should do when unwanted content appears on their screen, both at school and at home. Ask them if they've ever encountered a situation like this, and what led to the inappropriate material appearing. Some questions to consider:

  • Did you click a link or advertisement from another site?
  • Was it a pop-up that just appeared?
  • Was this the result of an innocent search that returned unwanted results? Did you mean to search for this?

Mistakes can happen, and kids are curious. The conversation should revolve around making smart choices when something unwanted does appear, innocently or otherwise.