Starr: Facial Recognition Tech: No Longer Science Fiction, but a Threat to Privacy
October 16, 2019
Two weeks ago, the Utah state Legislature debated the use of facial recognition technology by law enforcement agencies. Legislators heard comments from representatives of interest groups, the public safety department and an associate from Georgetown Law’s Center on Privacy & Technology. The use of facial recognition software introduces serious concerns about accuracy, the right to privacy and justice. If we hold these values dear, the Legislature should halt the use of the technology immediately until significant improvements are made to its operation.
Facial recognition technology consists of computer programs that detect and identify human faces. The software analyzes measurements of physical features — also known as biometrics — from photographs or videos and organizes that facial information in a database. It attempts to find a match and identify a person by comparing an input frame of a human face with faces already contained in the system, many of which were obtained from driver’s license files.
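To make that matching step concrete, here is a minimal sketch in Python of how such a comparison might work. Everything in it is an illustrative assumption: the four-number biometric templates, the license-file names, the cosine-similarity scoring and the 0.95 threshold stand in for the much larger and more complicated data real systems rely on, and this is not the method of any particular vendor or agency.

```python
import numpy as np

# Hypothetical enrolled "gallery": one small vector of facial measurements
# (biometrics) per stored photo, e.g. pulled from driver's license files.
# Real systems use far richer templates; these numbers are made up.
gallery = {
    "license_0001": np.array([0.12, 0.85, 0.33, 0.47]),
    "license_0002": np.array([0.90, 0.11, 0.62, 0.05]),
    "license_0003": np.array([0.14, 0.80, 0.30, 0.52]),
}

def similarity(a, b):
    """Cosine similarity: how closely two sets of facial measurements line up."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, threshold=0.95):
    """Compare an input face against every enrolled template and report the
    closest one, but only if it clears an arbitrary similarity threshold."""
    scores = {name: similarity(probe, template) for name, template in gallery.items()}
    name, score = max(scores.items(), key=lambda item: item[1])
    return (name, score) if score >= threshold else (None, score)

# A face captured from a surveillance frame, reduced to the same measurements.
probe = np.array([0.13, 0.83, 0.31, 0.50])
print(best_match(probe))
```

Even in this toy version, the result hinges on a threshold that someone chose; lower it slightly and faces that merely resemble each other start to count as matches.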
This type of technology is different from iris recognition systems and fingerprinting, which are also forms of biometric identification. Despite its growing use, facial recognition is significantly less accurate than these other two identifiers, and consent for the use of personal information is an emerging problem with the tech.
Facial recognition does not require consent in the way that iris recognition and fingerprinting do. You are actively involved when your fingertips touch ink and make contact with the paper, and you knowingly participate in the process. Much the same is true of iris recognition — typically, you stand a few inches from a scanner that isolates the pupil and iris, analyzes their features and stores the data. By contrast, photos and videos can be captured from great distances without the subject’s consent or knowledge. That makes the process inconspicuous and haunting: you have no ability to opt in before an image of your face is stored and used by law enforcement. A frame of your face, captured on the way to your vehicle or while walking around downtown, is seized without your knowledge or consent. Data about your personal features should only be obtained and warehoused with both your awareness and your permission.
Another huge departure from the more consensual biometric systems has little to do with opting into the process and everything to do with accuracy. Compared to fingerprints or iris recognition, facial recognition’s precision takes a nosedive. Its accuracy is easily hindered by lighting and by objects interfering with the frame, such as hair or other people. Many systems are also tripped up by low-resolution images and by the variety of facial expressions.
The technology does not always produce a 1:1 match when analyzing facial data. Often it returns a batch of potential candidates, leaving the job of picking a match to humans rather than an algorithm. Relying on a pair of human eyes to select a match from large batches of photos is not only unrealistic, subjective and sometimes incorrect, but it also reserves a seat for racial and gender profiling at the table of an already highly discriminatory public safety system. Is it really the best move for public safety departments to use systems that increase the risk of bias, which falls on communities of color and women at disproportionate rates?
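Continuing the same toy example, a sketch of that candidate-list behavior might look like the following: rather than declaring one identification, the program hands back a ranked shortlist and leaves the judgment call to a person. The 0.80 threshold and the templates are, again, assumptions made purely for illustration.

```python
import numpy as np

def similarity(a, b):
    """Cosine similarity between two facial-measurement vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def candidate_list(probe, gallery, threshold=0.80, top_k=5):
    """Return a ranked shortlist of possible matches instead of a single
    definitive identification, leaving the final call to a human reviewer."""
    scored = [(name, similarity(probe, template)) for name, template in gallery.items()]
    shortlist = [(name, score) for name, score in scored if score >= threshold]
    return sorted(shortlist, key=lambda item: item[1], reverse=True)[:top_k]

# Made-up templates: two enrolled faces resemble the probe, one does not.
gallery = {
    "license_0001": np.array([0.12, 0.85, 0.33, 0.47]),
    "license_0002": np.array([0.90, 0.11, 0.62, 0.05]),
    "license_0003": np.array([0.14, 0.80, 0.30, 0.52]),
}
probe = np.array([0.13, 0.83, 0.31, 0.50])
print(candidate_list(probe, gallery))  # prints several candidates, not one answer
```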
Not only do we insert the possibility of discrimination when we leave the matching process to people after the technology spits out options, but we also introduce race and gender bias within the technology itself, which can lead to misidentification and consequences as severe as lifelong imprisonment. Recent studies show that classification algorithms used in artificial intelligence, including facial recognition technology, performed best for men and for individuals with lighter skin tones. The tech performed worst for women of color, who are already more likely to face bias and violence during interactions with law enforcement. That potential for error is unacceptable, and a moratorium on current use by public safety departments should be a no-brainer for the Legislature.
This technology is easy to deploy given how many surveillance systems we already have in place, and because the capture of frames is discreet, it’s obvious why public safety departments would want to use it. That ease, however, doesn’t justify use without seriously weighing issues of privacy, inequity and accuracy. Fingerprinting people isn’t as convenient as facial recognition technology, but convenience cannot excuse a method of identification that clashes with the values of privacy distinctly codified in the Constitution.
So far, only a handful of cities in the United States have banned the use of facial recognition technology by local government and police. The Utah Legislature should follow the cautionary example set by cities like San Francisco and Somerville, Massachusetts, and discontinue use. The Fourth Amendment, standards of precision and the value of justice should all be championed in the face of easy, discreet technology that doesn’t truly serve communities. At the very least, facial recognition systems should be halted until the problems around their application are remedied.