Among those who do believe in varying levels of appropriateness of content online, there is disagreement about how this problem should be solved. Let me venture to say first that forced Internet filtering is not the correct answer.
What I mean by forced Internet filtering is something akin to what you would see in the HBLL library at BYU. Internet filtering software is installed on a separate computer, called a proxy, and every other computer has to connect to the Internet through that proxy. The filtering software restricts access to a list of sites, called a blacklist, that are suspected to contain objectionable content. The software also analyzes the content of each website that is requested by every computer and determines automatically whether that site is appropriate or not. If you try to navigate to a website that is deemed inappropriate, you instead get a webpage that effectively says, "Shame on you. Your attempt to look at pornography has been logged and reported."
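To make the mechanics concrete, here is a minimal sketch of the blacklist half of such a filter in Python. This is a toy, not how any particular product actually works, and the blocked domain names are made up:

```python
from urllib.parse import urlparse

# Toy blacklist of domains suspected to host objectionable content.
# These names are made-up examples.
BLACKLIST = {"badsite.example", "worse.example"}

def is_blocked(url, blacklist=BLACKLIST):
    """Return True if the URL's host is on the blacklist.

    Subdomains are blocked along with the parent domain, so
    'media.badsite.example' is caught by 'badsite.example'.
    """
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in blacklist)

print(is_blocked("http://badsite.example/page"))       # True
print(is_blocked("http://media.badsite.example/img"))  # True
print(is_blocked("http://www.google.com/"))            # False
```

A real proxy would sit between every request and the network and serve the "shame on you" page instead of the blocked site. The second half, automatic content analysis, is the genuinely hard part, as the rest of this post argues.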
Why doesn't forced Internet filtering work?
Forced Internet filtering, in its present form, does not work for the following reasons:
Too ambitious for a packaged software solution
Automatically categorizing all websites as appropriate or inappropriate is a lofty goal - a worthy goal, even - but there are so many different types and degrees of inappropriateness on the Web, in so many formats, that a single packaged piece of software cannot provide a satisfactory filter without a great deal of human oversight.
Technical solutions give non-technical parents a false sense of security
The technical boundaries that most parents place on their children are easily overcome. I speak from close personal experience. My parents thankfully did not attempt to use an installed Internet filter, but I'm sure it wouldn't have mattered much. Instead, there was an Internet password. You might be surprised to hear this, Mom, but learning that password was easier than taking candy from a baby, and I never had to watch you type it in. The BYU Internet filter is trivially easy to circumvent. Circumventing it is useful too, as I'll explain in the next section, because BYU blocks practically half of the Internet.
Not only can geeks like me get past a typical filter, but so can inappropriate content. It's not very much trouble at all for the content to evolve into a format that makes it past a filter. If you filter out all swear words, people will come up with new ways to spell or abbreviate them that are easily readable and understood. Media can be encoded, encrypted, and compacted in all kinds of different formats. Websites can be accessed through other websites. It's almost like the illegal drug trade. If you outlaw it, it will move underground.
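The point about respelled swear words is easy to demonstrate. Here is a deliberately naive word filter in Python (with a polite stand-in for the actual words); trivial obfuscations sail right past it:

```python
import re

BANNED = {"darn"}  # a polite stand-in for a real banned-word list

def naive_filter(text):
    """Return True if the text contains a banned word spelled verbatim."""
    words = re.findall(r"[a-z]+", text.lower())
    return any(word in BANNED for word in words)

print(naive_filter("well darn it"))     # True: caught
print(naive_filter("well d4rn it"))     # False: slips past
print(naive_filter("well d.a.r.n it"))  # False: slips past
```

Readers decode "d4rn" instantly; the filter never will unless its authors keep chasing every new spelling, which is exactly the underground-evolution problem described above.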
Sometimes there is no easy way to get around the filter. This can be the case if you work in a large corporation containing an entire IT division with the primary mission of locking down and restricting everything you do and touch while on the job. Even if you do break through the Iron Wall somehow, you might later get sacked like the little expendable peon that you are. I don't recommend working in such places for various reasons, but this is not really the environment I'm talking about anyway.
Unacceptable number of false positives
False positives are incidents in which the filtering software wrongly classifies a website as containing inappropriate content. While working at BYU as a web developer and as a research assistant, I constantly ran up against the "shame on you" page when trying to access legitimate websites. My colleagues all had the same experience. Since I don't access pornographic websites (not even accidentally - imagine that!), from my point of view this filter did nothing but prevent me from getting information I needed at inopportune times. There is no explanation for this sort of thing, just an inhuman automated declaration that you have accessed a site you shouldn't have. You might as well be talking to a phone rep at an electric company.
If you think it's bad at BYU, try going to someone's house where they've installed some shrink-wrap web filtering software - the kind that's marketed to the same kind of suckers who will install "anti-virus software" from a pop-up window. If you try to browse the web, you'll soon find that, little did you know, up to this date you've been swimming in the eternal lake of fire and iniquity and your all-but-lost spirit is shrouded in a ponderous chain that you have labored on every time you logged onto your computer! Never again shall you roam the wicked shores of the Island of Phelps. Let the heavens have mercy upon your soul.
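A classic source of false positives is naive substring matching, sometimes called the Scunthorpe problem: innocent words get flagged because a banned string happens to appear inside them. A toy illustration in Python:

```python
BANNED_SUBSTRINGS = {"sex"}  # a naive content rule

def false_positives(text):
    """Return the words a substring-based filter would flag."""
    return [word for word in text.split()
            if any(bad in word.lower() for bad in BANNED_SUBSTRINGS)]

print(false_positives("The Sussex and Middlesex county records"))
# ['Sussex', 'Middlesex'] -- perfectly legitimate words, flagged anyway
```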
Granularity is too coarse
When I talk about granularity, I'm talking about the smallest unit of information that can reliably be blocked by an Internet filter. Up to this point, that unit has been the entire domain. A domain is the first part of the URL for a website, like www.google.com. BYU blocks YouTube (at least it did when I was there). Every single thing on YouTube, no matter how vile or how enlightening, is blocked. That rules out a gigantic chunk of useful information on the Web as we know it today. It just so happens that it rules out a lot of cat videos and completely inappropriate videos too. I suppose the existence of the latter category and the collective IQ score of the associated comments for every video (-493e233) was the reasoning behind blocking it. It sounds like there's a good case to be made to someone high up at BYU for blocking the entire Internet. Some Internet filters block entire blogging domains like blogspot.com and wordpress.com.
All complaining aside, the fact is that the terrible filtering software can do absolutely nothing to separate the good videos from the bad, because they're all technically on the same website: YouTube. With a great software tool like that, your choices are: ban everything or allow everything. Dealing with the comments should be a trivial matter: they're all stupid, so just delete them from the page. Yet the web filtering software can't even do something as simple as that! Why not? Frankly, it's too much work for the developers of a packaged piece of software to put in special features for blocking specific parts of a lot of different domains. The software would have to detect the stupidity of the comments and delete them automatically, and while there's work being done in that direction (stupidfilter.org), there is still a long way to go.
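The all-or-nothing problem is easy to see in code: a domain-level filter reduces every URL to its bare domain before deciding, so a physics lecture and the vilest video on YouTube get the same verdict. A sketch in Python (the video IDs are hypothetical):

```python
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"youtube.com"}

def domain_of(url):
    """Reduce a URL to its bare domain -- all the filter ever sees."""
    host = urlparse(url).hostname or ""
    return host[4:] if host.startswith("www.") else host

# Two very different videos; the IDs are made up.
lecture = "http://www.youtube.com/watch?v=PHYSICS_LECTURE"
vile    = "http://www.youtube.com/watch?v=SOMETHING_VILE"

print(domain_of(lecture) in BLOCKED_DOMAINS)  # True
print(domain_of(vile) in BLOCKED_DOMAINS)     # True: identical verdicts
```

Everything after the domain - the path, the video ID, the actual content - is thrown away before the decision is made, which is why the only options are ban everything or allow everything.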
It's forced
Finally, the most important reason why forced Internet filtering does not work is simply that it's forced. The CS department at BYU used the Blue Coat web filtering software. We made up a slogan for it that we should have put on a T-shirt or something:
Blue Coat: Satan's plan for web filtering
When you forcefully take away the choice of what you can and cannot access online, you create a sort of prisoner culture. Within such a system, it is easy to feel that you are justified in getting away with whatever you can within the constraints of the system. Not only have the parents left the policing up to the Internet filtering software, but so have the kids. Getting to a website without being blocked by the filter must mean it's okay. Everyone is absolved of responsibility. It becomes a game almost: my parents installed a forced filter; therefore they don't trust me; therefore I feel no obligation to keep their trust; therefore it's simply their will against mine. Pretty soon I'll have found a way to get the content I want and I think, "Hah! I win this round! Bring it on Mom and Dad!"
Now obviously there are varying levels at which you can trust a child with the responsibility of browsing the Web, depending on the child's personality and age. But strict, parent-controlled Internet filtering is not going to grow or adjust well for teaching different levels of accountability, and I've explained why a typical Internet filtering solution does not work even if it isn't parent-controlled.
Now that I've talked about what doesn't work, maybe we can start thinking about what might work...
What do you think about accountability software options (like Covenant Eyes) when it is freely chosen by an individual?
From what I've gleaned, the Covenant Eyes accountability software tackles a different problem: helping people who are struggling with pornography pull out of it by consenting to be monitored by a trusted party. It sounds like a good idea to me, but I'm not familiar with that kind of problem.
The Covenant Eyes filtering software, on the other hand, suffers from all of the problems described above.
I thought what you said was really interesting, especially the part about filters making people feel less accountable because if they could get away with it, then it clearly wasn't wrong... I dated a guy at BYU who (I later found out, much to my horror) had figured out how to get to all the porn sites by using other languages. Clearly, he thought that made it okay. (Shudder) However, I do like the idea of having filters when my kids are little so they don't accidentally stumble onto a site they shouldn't see... But that's for when they're, like, five. Not for much older than that, probably! I also think ALL of this is why it's so important just to keep the computer where it's highly visible so it's easier to police the whole thing...
I agree with a lot of what you have to say here. I do think filters are good ideas for the purpose of stopping unnecessary and unintentional exposures to pornography online for young children (which can happen even if I'm sitting with my child while on the computer), but overall I think accountability software is a better option.