Friday, March 23, 2007

Parental Controls: Best Practices

The Child Online Protection Act of 1998 (COPA) received its final body-blow yesterday in the form of a ruling by Senior Judge Lowell Reed of the US federal court in Philadelphia striking the law down.

Siding with the ACLU, Judge Reed said that while he was personally in favor of restricting a child's access to porn, First Amendment rights were more important. He also made the point that there are a number of technologies out there designed to enable parents and teachers to put filtering and monitoring in place.

Despite its failings, I'm a big fan of the First Amendment - and I'm glad it's the First Amendment, because to me that underscores its fundamental nature. So I would rather keep the First and solve this problem through technology.

As the world's leading licensee of those technologies, we have seen our fair share of filtering approaches. Some are incredible. Some don't work very well, and some don't work at all. I remember one famous vendor demonstration at a trade show in New York during which not a single site (out of fifty) was blocked. That company is no longer in business.

Aside from the methodologies related to analysis, policy-setting can be an issue as well. I remember another incident when we were testing the original PICS voluntary policy controls and came across a "naturist" (read: nudist - see comment below) site that had given itself the equivalent of a "G" rating. I called the guy up. "Everyone should see how we live - there's nothing wrong with it", he said to me.

Clearly he was not looking at the same shots I was.
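For context, PICS labeling was self-rating: a site embedded a machine-readable label in its own pages, and filtering software trusted it. Under the RSACi vocabulary of the day, an all-clear self-label looked roughly like this (the site URL here is a placeholder):

```html
<!-- Hypothetical PICS self-label using the RSACi rating vocabulary.
     The "for" URL is a placeholder; r (n 0 s 0 v 0 l 0) claims zero
     nudity, sex, violence, and language - the equivalent of the
     "G" rating that naturist site gave itself. -->
<meta http-equiv="PICS-Label" content='(PICS-1.1
  "http://www.rsac.org/ratingsv01.html"
  l for "http://www.example.com/"
  r (n 0 s 0 v 0 l 0))'>
```

The obvious weakness, as the anecdote shows, is that the label says whatever the publisher wants it to say.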

The folks that have been around a while - PureSight, RuleSpace, SurfControl - and the folks that have good feedback systems built into their products, such as Fast Data Technologies, NetSweeper and Websense - have built some impressive systems and, in aggregate, have created a situation where policies can be set for upwards of three-quarters of a billion pages, across scores of categories.
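At heart, these systems pair a very large page-to-category database with a per-household or per-school policy over those categories. A minimal sketch of that lookup - with domain names, categories, and the policy all invented purely for illustration - might look like this:

```python
# Illustrative sketch of category-based filtering. The domains,
# categories, and blocked list below are invented for this example;
# real vendors maintain databases covering hundreds of millions of pages.

# Vendor database: maps a domain to the content categories it falls under.
CATEGORY_DB = {
    "example-news.com": {"news"},
    "example-casino.com": {"gambling"},
    "example-health.org": {"health", "sex-education"},
}

# One family's (or school's) policy: categories to block.
BLOCKED_CATEGORIES = {"gambling", "pornography"}

def is_allowed(domain: str) -> bool:
    """Allow a domain unless it carries at least one blocked category.

    Unknown domains are allowed here; a stricter policy could
    default-deny anything the database has never categorized.
    """
    categories = CATEGORY_DB.get(domain, set())
    return not (categories & BLOCKED_CATEGORIES)

print(is_allowed("example-news.com"))    # allowed
print(is_allowed("example-casino.com"))  # blocked
```

The hard part, of course, is not this lookup but building and correcting the database - which is exactly where the feedback systems mentioned above earn their keep.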

With that much filtering going on, is a law really needed? Probably. While some kids are no doubt capable of self-policing, not all kids have the same processing powers and sensibilities. The views of parents also vary widely - a majority of children interviewed in a 2004 survey said they needed *more* supervision from their parents, and 40% of them stated that their parents were wrong to trust them to do the right thing without guidance.

That's why laws are needed - for the same reasons that truancy laws, or gun-lock laws, or child seat laws are needed - because you can't always rely on every parent out there to understand their child's sensibilities, or know the right thing to do, and you shouldn't underestimate a child's need for guidance.

As for the technologies, we'll keep fine-tuning our filtering and our policies, but my personal view is that involvement in your kids' online activities - and knowledge of their interests - is by far the best parental-controls policy.

3 comments:

Dan said...

Since you use a Naturist (not Naturalist) site as an example, let me expand on that.

I have run several nudist websites over the past 14 years. Nudism is an activity practiced by many families, who also want to protect their children from inappropriate content.

Many unscrupulous people are labeling sites as nudist or naturist and using them to serve up pornographic content in an effort to attract visitors.

The problem arises when a legitimate and informative nudist site - one that would otherwise be regarded as a valuable source of information about a widely accepted lifestyle - is lumped together with these porn and semi-porn sites by the filtering services, blocking access to it.

There needs to be more user input available in order to filter certain levels of a topic, rather than arbitrarily dumping them all.

While this shotgun approach makes it easy for the lazy user to protect their computers from every possible offensive site, it also restricts useful information in much the same way that Google users in China are denied information about politically sensitive topics.

John C. Sharp said...

I agree with your comment that we, and our ISP customers, should not take a "shotgun" approach to web filtering.

Examples of sites that may need to be excluded from filtering are child sex-education, breast cancer, and STD awareness sites.

The less obvious examples are sites that may not comply with societal norms, such as your naturist sites. Where global policy settings are implemented, these policies usually end up skewed to the majority.

In instances where global restrictions are not applied, I think technology really is the answer - self-regulation can enable those parents that follow alternate lifestyles to make decisions that they believe will impact their families positively.

For some families, that may be gun-education, for others, it may mean sex education. For others still, it may mean blocking of all sites other than those supporting a particular world view, or religion.

Thanks for your submission, Dan.

fun said...

Well, can someone explain to me why the .xxx domain for porn sites was rejected?

I appreciate that the omnipresent criminal element will simply ignore the regulation, but legitimate vendors of pornography will want to comply, and that in itself will reduce the instances of innocent click-throughs.

I believe a dedicated .xxx domain would allow the Playboys and Penthouses of the world to place their banner ads wherever they want, safe in the knowledge that a school's filtering system has to do nothing more than apply a single rule: "allow access to .xxx: no".