In the 1997 flick Face/Off, John Travolta wears Nicolas Cage's face in order to infiltrate Cage's criminal gang. Last week, movie fantasy became medical reality when a team of French doctors successfully performed the first partial face transplant on a woman who had been mauled by a dog.
It's a timely development, because new facial-recognition technology is moving us toward surveillance that is unnoticeable, distributed, persistent, searchable and cheap. And as the technology's effectiveness improves, a new face may be the only way to preserve some semblance of privacy.
A great example of the state of the art comes from a new company called Riya, which recently launched a beta facial-recognition service for the masses.
The service builds on current multimedia search techniques. With billions of bits of information out there, finding what you want is impossible without good search tools, and there hasn't been a really good way to search multimedia files like photos, video and sound. A common current practice, used by photo sites like Flickr, is to encourage users to tag photos, and then allow text searches of those tags.
Riya also relies on meta tags, but uses facial-recognition software to create them automatically. Subscribers upload photos, and then tell the Riya software who the person is. By repeatedly running the recognition algorithm against multiple photos of the same person, Riya software eventually learns to identify other images of the same face. Once trained, the software will automatically generate meta tags, and users can search their own photos and the photos of other subscribers.
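Riya has not published how its matcher works, but the label-then-auto-tag loop it describes can be sketched generically. The snippet below is a toy illustration only: it assumes faces have already been reduced to small feature vectors (real systems derive these from a face-detection model) and uses simple nearest-neighbor matching, one common approach, to stand in for whatever Riya actually does.

```python
import math

# Toy face "embeddings": in a real system these vectors would come from a
# face-detection/feature-extraction model. Nearest-neighbor matching on such
# vectors is one common technique; it stands in here for Riya's unpublished
# algorithm to illustrate the train-then-auto-tag workflow.

class FaceTagger:
    def __init__(self, threshold=0.5):
        self.examples = []          # (embedding, name) pairs from user labels
        self.threshold = threshold  # maximum distance to accept a match

    def label(self, embedding, name):
        """The subscriber tells the software who a face belongs to."""
        self.examples.append((embedding, name))

    def auto_tag(self, embedding):
        """Once trained, generate a meta tag for a new face, or None."""
        if not self.examples:
            return None
        dist, name = min(
            (math.dist(embedding, e), n) for e, n in self.examples
        )
        return name if dist <= self.threshold else None

tagger = FaceTagger()
# Repeatedly labeling photos of the same person trains the matcher.
tagger.label([0.9, 0.1, 0.3], "Grandma")
tagger.label([0.8, 0.2, 0.3], "Grandma")
print(tagger.auto_tag([0.85, 0.15, 0.3]))  # a close face gets her tag
print(tagger.auto_tag([0.1, 0.9, 0.9]))    # a distant face gets no tag
```

The privacy point follows directly from the design: once the labeled examples exist, tagging every future photo is automatic and costs nothing.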
The service currently only searches photos uploaded to its servers. The technology could, however, be deployed across the internet, allowing people to search the web, Flickr, Tribe and Friendster photo sets, regardless of whether the owner or the person photographed wants to be identified. That's where things get interesting.
Mothers could search MySpace.com and find pictures of their children at a party when they were supposed to be studying at a friend's house. Insurers could search and find a photo of a customer bungee-jumping, and raise the daredevil's premiums. I predict that the tool will be invaluable to former (and future) boyfriends and girlfriends checking up on lovers.
In the analog days, when you left your house, there was always a possibility that you might run into someone who would remember what you were doing, and tell anyone who cared enough to ask. In a digital world, you do not know if someone is taking your picture -- with a camera, a webcam or a cell phone -- and the image can be stored forever and searched by people you do not know, at any point in time, without your knowledge and at little or no cost to the searcher.
Even in public, we used to enjoy some privacy, if only in our anonymity. Facial-recognition technology is one reason that's increasingly less true.
It's worth noting that, as good as the technology is, facial recognition alone isn't good enough to identify terrorists. An independent review of products available to the government concluded that current technology cannot be relied on to pick an individual out of a crowd.
Even 99.99-percent perfect technology is statistically useless for identifying terrorists. If one in 10 million fliers, on average, is a terrorist, the software will generate 1,000 false alarms for every one real evildoer. "It's The Boy Who Cried Wolf increased 1,000-fold," security expert Bruce Schneier has explained.
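The 1,000-to-1 figure is plain base-rate arithmetic (it requires a 0.01-percent false-alarm rate, i.e. 99.99-percent accuracy), and can be checked directly:

```python
# Base-rate arithmetic behind the false-alarm problem: a 99.99-percent
# accurate system (0.01% of innocents wrongly flagged) screening 10 million
# fliers, of whom one is actually a terrorist.
fliers = 10_000_000
terrorist_rate = 1 / 10_000_000
false_alarm_rate = 1 - 0.9999          # 0.01% of innocents flagged

terrorists = fliers * terrorist_rate                      # 1 real evildoer
false_alarms = (fliers - terrorists) * false_alarm_rate   # ~1,000 innocents

print(round(terrorists), round(false_alarms))  # → 1 1000
```

The lesson generalizes: when the thing you are looking for is rare, even a tiny error rate buries the one true hit under a pile of false ones.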
But Riya is banking on the software being good enough to pick out your grandmother in your holiday photos. Also, a mistake here isn't as costly as misidentifying a terrorist, so the software has lots of chances to learn your face and pick it out at the local gay bar, fast-food joint or political protest.
Riya has a tart answer to user privacy concerns. If you do not want to be indexed, do not let anyone post any photos of you. This is easier said than done. Also, the response falsely assumes that being seen has the same privacy implications as being identified. It does not.
If someone posts a photo that includes me, people who see the photo will see my face, but they almost certainly will have no idea who I am. If the photo is tagged with my name, a stranger who likes the way I look can then find out more about me. A prospective employer looking for more information about me can search my name and find the picture.
The privacy principles that made sense in the analog world do not make sense in a digital one. Historical assumptions about the relationship between being seen and being tracked have expired. We have to face these facts, or we'll miss the bigger picture.
Transplant surgery is too extreme for most of us. But cheap, ubiquitous and persistent searchability will forever change what it means to have your picture taken. Our reaction as citizens and as policymakers has to be balanced as well. Search is great. It's being found that's the problem.
- - -
Jennifer Granick is executive director of the Stanford Law School Center for Internet and Society, and teaches the Cyberlaw Clinic.