The World Wide Web Consortium is finalizing extensions to its Internet content-labeling framework that will simplify installation and use, letting filters be installed with a proverbial click of the mouse. This promises to bring more widespread use of the Platform for Internet Content Selection technology - as well as to heat up the debate over voluntary Net censorship.
The PICS specs are in review, but will be made official recommendations by year's end, a W3C spokesperson said. Foremost among the extensions is PICSRules 1.1, a language definition for writing general profiles: sets of filtering rules that allow or block access to URLs, based either on PICS labels or on the URLs themselves. The idea is that institutions can publish these profiles on the Web and - using PICSRules-aware software - end users can install them without any configuration.
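A PICSRules profile is a parenthesized rule file that names the rating services it relies on and then lists policies. A minimal sketch might look like the following (the service URL, shortname, and category name here are hypothetical, and the syntax follows the draft spec):

```
(PicsRule-1.1
  (
    Policy (RejectByURL ("http://*@www.blocked-example.com:*/*"))
    ServiceInfo (
      name "http://ratings.example.org/v1.html"
      shortname "Ex"
    )
    Policy (RejectIf "(Ex.violence > 2)")
    Policy (AcceptIf "otherwise")
  )
)
```

Because the whole policy lives in one file of this form, PICSRules-aware software can install it in a single step - which is what makes the one-click setup possible.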
Its complement, DSig 1.0, is a specification for creating digitally signed statements about resources. It will be used with PICS so that others can verify the authenticity of a given label. James Miller, one of the authors of the PICS spec and former Domain Leader of Technology and Society at the W3C, said this specification was necessary because traditional digital signature implementations do not work for PICS labels. "There is nothing that applies to these labels," he said. "Digital signatures let you take a document, sign it and create a cryptographically-sealed envelope. You don't want to do that with these labels - you don't want the label inside the cryptographic wrapper."
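The distinction Miller draws - a signature that travels alongside a readable label rather than sealing it inside an envelope - is the idea of a detached signature. A minimal sketch in Python, using an HMAC as a stand-in for DSig's public-key signatures (the key and the label text are invented for illustration):

```python
import hashlib
import hmac

# Hypothetical signing key; a real rating service would use a
# public/private key pair, not a shared secret.
SERVICE_KEY = b"rating-service-secret"

def sign_label(label: str) -> str:
    """Produce a detached signature over a PICS-style label.

    The label text stays readable; the signature travels alongside
    it instead of wrapping it in a cryptographic envelope.
    """
    return hmac.new(SERVICE_KEY, label.encode("utf-8"), hashlib.sha256).hexdigest()

def verify_label(label: str, signature: str) -> bool:
    """Check that a label still matches its detached signature."""
    return hmac.compare_digest(sign_label(label), signature)

# An invented PICS-style label, readable with or without the signature.
label = '(PICS-1.1 "http://ratings.example.org/v1.html" labels ratings (violence 0))'
sig = sign_label(label)

print(verify_label(label, sig))                                 # unmodified label verifies
print(verify_label(label.replace("violence 0", "violence 4"), sig))  # tampered label fails
```

The point of keeping the label outside the wrapper is exactly what the demo software needs: a browser can read and act on the label directly, and consult the signature only when it wants to check who vouches for it.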
While Miller said that the whole goal is for everybody to be their own rating service, with no central control of any kind, it remains to be seen just who will be the authoritative party providing the ratings.
"The [PICSRules protocol] makes sense only if one believes that there are in fact going to be a whole bunch of different ratings services out there," said Jonathan Weinberg, author of "Rating the Net" and an associate professor of law at Wayne State University. "If there is a service rating pages for educational content, and another one rating pages for violence, then it may be useful to have PICSRules in order to write a profile saying, 'block all violent pages (based on the output of service A) unless they're educational (based on the output of service B).' That's exactly what the folks at W3C have in mind. But I'm skeptical that all of these ratings services are in fact going to emerge - there doesn't seem to be a sensible business model that would support them."
As demonstrated at the Focus on Children Internet Online Summit recently, these two extensions will make it a breeze to view a filtered Web. The demo - using a modified form of the W3C-developed Web browser, Amaya - gave a peek at what the near future holds for commercial products, such as those offered by IBM and Microsoft.
"We modified Amaya by adding a new data type for PICSRules files," said Miller. "The idea was when Amaya came across a PICSRules file, it would load it and ask you for a name of a child and a password - so that the child couldn't set the rules themselves - and then as soon as you've done that, it installs those rules. There's nothing else you need to do."
Describing a possible real-world scenario, Miller said that a parent might go to a Web site and find the "suggested parental controls." There might be a selection of 15 controls, each with a description and graphic telling you what would happen once it was installed. After reading those and deciding which one is acceptable, you would click on it, type the name and password of the child you want it to apply to, and be done. "So it really becomes point, click, and it's set," Miller said. "Our goal is to get what the industry's now calling transparency."
Two of the companies whose staff co-authored PICSRules already ship PICS-enabled products: IBM's Internet Connection Secure Server and Microsoft's Internet Explorer browser, the latter of which has supported PICS for well over a year. But right now, the PICS implementation in Internet Explorer, for example, involves adjusting various settings in the program's options menu.
But the tradeoff for this ease of use, said Miller, is that the user has far less control over what gets filtered.
Michael Froomkin, an associate professor at the University of Miami School of Law, agrees. "It's very user-friendly, but the devil is in the details," he said. "Specifically, there is at least a theoretical danger that choices about defining the policy choices may constrain the policies people develop. On first glance it seems as if the PICSRules is sufficiently general to avoid this problem, but I can't be sure."
More serious, said Froomkin, is the plan to have third parties provide full-blown policy choices. Users would rely on the reputations of others - the Latter-Day Saints, the People's Republic, or the ACLU, for instance - to provide policies for them. "Users are empowered to choose who will make their choices, but the system is sufficiently complex to discourage the average person from making their own choices themselves," he said. "So you pick who will be your censor."
Another interesting aspect of these specifications is their possible use in what W3C director Tim Berners-Lee calls the creation of a "Web of trust." Miller explained that since the Web is a brand-new medium, one thing that must be established is "trust" - in other words, knowing what is being said by whom, and being able to verify that. PICSRules and DSig could be used to do this, and while he said that PICS was developed specifically for protecting children, a still-in-development richer system - at first called "PICS 2" but now known as Resource Description Framework - could be used for more general purposes.
While the W3C has been trying to encourage other uses for PICS-type labels, Weinberg said that PICSRules is a step in the other direction. "PICSRules is a protocol for content blocking, and nothing else," he said. "It thus seems to validate the image of PICS as simply a template for censorware, and it cuts against any image of PICS as a way of manipulating metadata for more wholesome purposes."
"This is a self-censorship model, with a little help from your friends at the organization of your choice," said Froomkin, who said that it is all about "not seeing things that disturb you and/or keeping your kids from seeing things that disturb you."