WASHINGTON -- In an attempt to allay growing public concern, defenders of face-recognition software for video cameras gathered in the nation's capital on Wednesday.
The stated purpose of the press conference: "To stop irresponsible grandstanding and fear-mongering, and to start an open and honest dialogue on the shaping of policies, which will ensure responsible use," in the words of the director of the Security Industry Association, Richard Chace.
The effect: not much.
Primed by hundreds of news articles chronicling the outcry over Tampa, Florida's decision to install facecams, the score of journalists who showed up weren't in the mood to be persuaded that the technology was benign.
One reporter asked if police might use face-recognition software to track political dissidents. Another questioner wondered if the SIA, a trade association with over 350 members, had a position on companies that give the technology to police agencies in return for a percentage of fines collected. (The answers: yes and no.)
When a nettled Visionics chief executive Joseph Atick was asked if his company's software was being used in countries that consider academic dissidents to be criminals, he shot back: "Do you consider Great Britain to be one of those countries?"
That was a reference to Newham, England's decision in 1999 to install Visionics' FaceIt face-recognition system -- the same software that Tampa police have adopted. With 300 cameras now in place, Atick said, crime in that area of the city has dropped 40 percent since the facecams were installed.
Atick and the other speakers signaled a strong willingness to work with Congress and other legislators. They said, for instance, that it was reasonable to post signs saying that face matches were taking place, that recorded faces not matching ones in the databases should be deleted, and that "technical and physical safeguards" should be in place.
But on the point that matters to privacy advocates -- that is, whether the facecams should be turned on at all -- they weren't about to budge.
"Behavior and activity exhibited in a public area is obviously available for observation by others," said SIA's Chase. "Police observation of activities conducted in plain view in a public place, therefore, does not violate the Fourth Amendment guarantee against unreasonable search and seizure."
Chase added, for good measure: "This technology is about public safety and life safety. It is an invaluable tool for law enforcement to ensure we have safe communities in which we raise our families."
Notably absent from the presentations was a reference to a report in Wednesday's edition of the St. Petersburg Times. The newspaper article said that Tampa police used an image of Rob Milliron, a local construction worker caught on camera and not wanted for any crime, to demonstrate the facecam system to reporters. After U.S. News and World Report published the photograph, a woman erroneously reported Milliron as wanted for child support, and police showed up at his job.
Police use of software similar to that marketed by Visionics at Super Bowl XXXV in Tampa prompted public outcry, but city officials moved forward with plans to station cameras outfitted with the software on city streets. Faces are matched on a scale of 1 to 10, with police computers emitting an audible "whoop whoop!" alarm for scores of 8.5 and above.
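For readers curious how that kind of threshold-based alert works, here is a minimal, purely illustrative sketch. Only the 1-to-10 scale and the 8.5 cutoff come from the description above; the feature vectors, the similarity measure, and every function name below are hypothetical stand-ins, not FaceIt's proprietary algorithm.

```python
# Illustrative sketch of threshold-based face-match alerting.
# The 1-to-10 scale and the 8.5 alarm cutoff come from the article;
# everything else is a stand-in, not FaceIt's actual implementation.
import math

ALARM_THRESHOLD = 8.5  # scores at or above this trigger the audible alarm

def match_score(face_a, face_b):
    """Stand-in scorer: cosine similarity of two feature vectors, mapped onto a 1-10 scale."""
    dot = sum(a * b for a, b in zip(face_a, face_b))
    norm = math.sqrt(sum(a * a for a in face_a)) * math.sqrt(sum(b * b for b in face_b))
    similarity = dot / norm if norm else 0.0      # ranges roughly -1.0 .. 1.0
    return 1.0 + 9.0 * max(similarity, 0.0)       # map onto the article's 1-10 scale

def scan_face(captured, watchlist):
    """Compare one captured face against every watchlist entry; alert on a strong match."""
    for name, stored in watchlist.items():
        score = match_score(captured, stored)
        if score >= ALARM_THRESHOLD:
            print(f"whoop whoop! possible match: {name} (score {score:.1f})")
            return name, score
    return None, None  # no match above threshold; unmatched faces would be deleted

# Toy usage with made-up feature vectors:
watchlist = {"suspect_001": [0.9, 0.1, 0.4]}
scan_face([0.88, 0.12, 0.41], watchlist)
```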
Privacy advocates say this practice, which is growing in popularity among law enforcement, is set to become the greatest privacy threat of the next decade.
As cameras become ubiquitous, as face-recognition technology becomes more accurate, and as databases of known faces grow, the argument goes, everyone from direct marketers to the FBI will be able to track your movements and compile detailed dossiers on your life. Could photographs from driver's license records be folded into those databases, allowing police to track your movements at all times?
Police already swear by this technology.
Thomas Seamon, president of the International Association of Chiefs of Police, appeared at the press conference to downplay public concern and tout facecams as merely a close cousin of closed-circuit television.
Said Seamon: "CCTV and facial recognition technology could prove to have definite applications in recognizing criminals with active arrest warrants, recidivists who repeatedly rob and burglarize areas, in locating lost children and others, and in protecting women by helping to enforce court-ordered stay-away and protection orders."
Andrew Osterman contributed to this report.