One of the byproducts of the rapidly increasing power of microchips is the proliferation of chip-powered cameras and devices that are constantly watching and, in some cases, recording the world in front of them in high definition. There are cameras in doorbells and ATMs, overlooking building entrances and parking lots, on traffic lights and retail store walls.

Photo caption: The Washington County (Oregon) Sheriff’s Office was the first law-enforcement agency in the country to use Amazon’s artificial-intelligence tool Rekognition. San Francisco recently became the first city to ban police and city agencies from using facial recognition software. The Washington Post/Jhaan Elker

Law enforcement agencies have taken advantage of these electronic eyes to gather evidence of crimes, trace suspects and search for missing persons. Now, police around the country are embracing a technology that can turn recordings into results far more efficiently: facial recognition software. The software matches images from a database of pictures – for example, driver’s licenses or mug shots – against what a security camera has recorded to try to identify the people in the recordings.
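For readers curious about the mechanics, the sketch below shows in broad strokes how this kind of matching typically works: each reference photo and each face cropped from footage is first reduced to a numeric "embedding" vector by a separate model, and the software reports the closest database entry only if the similarity clears a threshold. This is a minimal illustration with made-up vectors, not Rekognition’s or any vendor’s actual code; the function names and the 0.8 threshold are assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, database, threshold=0.8):
    """Return the label of the closest database entry, or None if nothing clears the threshold."""
    best_label, best_score = None, -1.0
    for label, reference in database.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Toy stand-ins for real embeddings produced by a face model (illustrative only).
rng = np.random.default_rng(0)
database = {
    "drivers_license_001": rng.normal(size=128),
    "mugshot_042": rng.normal(size=128),
}
probe = database["mugshot_042"] + rng.normal(scale=0.05, size=128)  # noisy crop from camera footage
print(identify(probe, database))  # expected: "mugshot_042"
```

The threshold is the kind of tunable parameter that drives the trade-offs discussed below: set it low and more innocent people are flagged as matches; set it high and more genuine matches are missed.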

One problem, though, is that the software isn’t 100 percent accurate. Some of the versions on the market have an especially poor track record when it comes to identifying darker-skinned people. For example, a recent study found that Amazon’s Rekognition misidentified women’s images as men’s 19 percent of the time, and darker-skinned women’s images as men’s 31 percent of the time.
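To make statistics like these concrete, here is a small, purely hypothetical example of how such per-group error rates are computed in an audit: run the classifier over a labeled test set, then divide each demographic group’s misclassifications by that group’s size. The records below are invented for illustration and are not data from the study.

```python
# Hypothetical audit records: (demographic group, true label, predicted label).
# These rows are invented to illustrate the calculation; they are not study data.
records = [
    ("lighter-skinned women", "woman", "woman"),
    ("lighter-skinned women", "woman", "woman"),
    ("lighter-skinned women", "woman", "man"),
    ("darker-skinned women", "woman", "man"),
    ("darker-skinned women", "woman", "woman"),
    ("darker-skinned women", "woman", "man"),
]

def error_rate(group):
    """Share of a group's test images that the classifier labeled incorrectly."""
    rows = [r for r in records if r[0] == group]
    errors = sum(1 for _, truth, guess in rows if truth != guess)
    return errors / len(rows)

for group in ("lighter-skinned women", "darker-skinned women"):
    print(f"{group}: {error_rate(group):.0%} misclassified")
```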

The San Francisco Board of Supervisors seized upon this flaw in an ordinance adopted Tuesday. It requires government agencies in the city to develop use policies for their surveillance technologies and to obtain the board’s approval before acquiring and deploying such tools. The one exception is facial recognition software, which the ordinance flatly prohibits city agencies from using.

“The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring,” the ordinance states.

The board was wise to impose some badly needed transparency and oversight on local law enforcement’s use of monitoring and tracking technologies, such as license plate readers and cellphone locators. Police agencies are focused on fighting crime and are often too cavalier about preserving civil liberties; elected officials are the ones who should be deciding how much of their constituents’ freedom from surveillance to sacrifice in order to make them more secure.

That should have been the approach taken to facial recognition software as well. Although it’s frightening to think how this technology could be misused – China’s Orwellian monitoring of its population, and in particular its persecution of the Uighur Muslim minority, is Exhibit A – the technology itself isn’t evil. In fact, it can be a lifesaver. And new applications of the software are emerging all the time.

Even if we don’t think the technology is good enough yet for police to use in identifying suspects, we may welcome its use by police in search-and-rescue operations, finding missing youths who’ve been victimized by sex traffickers or providing real-time security at major public events. Meanwhile, the technology is steadily improving, and in some cases can do a better job at identifications than humans do.

Local governments need to approach all surveillance tools carefully, setting standards for how well the technology must perform and policies to govern how the tools can be used. In the case of facial recognition, a number of serious questions remain to be answered before law enforcement agencies are given the green light. What image sources are reliable enough to be used for identifications – driver’s licenses? Mug shots? Can people be added to the database of images without their knowledge or consent? How should the software be tested? How much detail do agencies need to release to the public about their use of the technology?
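What such standards could look like in practice is easy to sketch. The fragment below is a hypothetical use policy expressed as a machine-readable configuration; every field name and value is an assumption offered for illustration, not a rule any jurisdiction has adopted.

```python
# Hypothetical policy parameters a city might require an agency to publish
# before deploying facial recognition; all names and values are illustrative.
FACIAL_RECOGNITION_POLICY = {
    "approved_image_sources": ["mugshots"],       # e.g., exclude driver's-license photos
    "consent_required_for_enrollment": True,      # no one added to the database unknowingly
    "minimum_audited_accuracy": 0.99,             # measured separately across demographic groups
    "maximum_group_error_gap": 0.01,              # allowed disparity between groups
    "independent_testing_required": True,
    "warrant_required_for_tracking": True,        # movement tracking from camera to camera
    "public_reporting_interval_days": 90,         # how often usage statistics are disclosed
}
```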

Some applications are so intrusive – such as using cameras with facial recognition abilities to track a person’s movements from camera to camera – that they shouldn’t be available to police without a warrant. There should also be safeguards to ensure that law enforcement agencies don’t circumvent the rules by obtaining information from cameras and facial recognition software deployed by private businesses.

In other words, this powerful technology requires oversight and caution to prevent it from being abused. But a ban would throw the good uses out with the bad ones.

 

