

FBI information gathering

On Behalf of Kammen & Moudy, LLC | Dec 18, 2019 | Firm News

Many science fiction movies offer visions of dystopian futures. The scariest ones often look calm and happy on the outside. That appearance is deceiving, as the people in those futures are often completely controlled and constantly monitored.

Even scarier? That future might already be here. According to New York magazine, the United States is deeply invested in facial recognition technologies that can identify us with 99.8% accuracy. If your face is in a database, the cameras know who you are. So do the agencies behind them. And there are legal concerns tied to this form of information gathering and storage.

What is facial recognition software?

Facial recognition software uses algorithms to study photographs of people’s faces. Then it compares them against databanks of photos that are tied to names and other information. If the program decides the two photographs look similar enough, it signals a match. Whoever is using the software suddenly has a name for the photo, and he or she often has a lot more information than that, too.
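
To make that matching step concrete, here is a minimal sketch in Python, assuming a simplified setup in which each photo has already been reduced to a short list of numbers (an “embedding”). The names, vectors and threshold below are invented for illustration and do not reflect any agency’s actual system.

    # Illustrative sketch only: real systems derive face "embeddings" from
    # photos with trained neural networks; here we use made-up vectors to
    # show the comparison-and-threshold logic described above.
    import numpy as np

    def cosine_similarity(a, b):
        """Similarity between two face embeddings, ranging from -1 to 1."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def best_match(probe, database, threshold=0.90):
        """Return the closest enrolled name, but only if the similarity
        clears the threshold; otherwise report no match."""
        scored = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
        name, score = max(scored, key=lambda pair: pair[1])
        return (name, score) if score >= threshold else (None, score)

    # Invented example data: three enrolled faces and one new photo.
    database = {
        "person_a": np.array([0.11, 0.52, 0.31, 0.80]),
        "person_b": np.array([0.70, 0.10, 0.65, 0.22]),
        "person_c": np.array([0.33, 0.48, 0.29, 0.77]),
    }
    probe = np.array([0.12, 0.50, 0.33, 0.79])

    print(best_match(probe, database))  # ('person_a', 0.999...) at threshold 0.90

The threshold is the software’s version of “looks similar enough.” Set it high and the program returns fewer, more confident matches; lower it and it returns more matches, including more false ones, a point that matters again later in this article.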

Who uses facial recognition software?

The short answer is “everyone.” Or, at least, far more people and businesses than you might suspect. Some of the examples New York magazine listed included:

  • Facebook
  • Amazon
  • Google
  • The NYPD, FBI, and other enforcement agencies
  • Schools
  • Retailers (often to ID shoplifters)
  • Churches
  • Skate parks
  • Individuals

One of the individual users the magazine noted is singer Taylor Swift, who uses the technology to scan her crowds for known stalkers. In short, facial recognition software isn’t coming soon; it’s already here.

What are some of the problems?

There are problems with both how well the technology works and with the ways it is used.

In general, the technology works very well. When the National Institute of Standards and Technology tested 127 programs from 40 different makers, it found the best ones got matches wrong only 0.2% of the time. But there are caveats:

  • The software struggles with twins
  • It can struggle with race
  • A 0.2% failure rate can still be a big deal

This last point is especially important when the technology is used on a large scale or asked to work with poor-quality images. One algorithm that boasts a 0.1% failure rate drew attention in London when it misidentified 34 of the 42 people it flagged as suspects.
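
To see why a small error rate still matters at scale, here is a back-of-the-envelope calculation; the number of searches is an assumption chosen purely for illustration, not a figure from the article.

    # Back-of-the-envelope arithmetic: even a small error rate adds up
    # when a system is used at scale. The search volume here is assumed.
    error_rate = 0.002             # 0.2%, the best result in the NIST test above
    searches_per_year = 1_000_000  # hypothetical number of lookups

    expected_errors = error_rate * searches_per_year
    print(f"Expected erroneous results per year: {expected_errors:,.0f}")  # 2,000

Even at the best tested accuracy, routine use can produce thousands of wrong results, and each one points at a real person.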

And then there are the problems with the ways police and other people have used the technology:

  • Researchers say the technology is reliable when the match threshold is set high, but police have sometimes lowered that threshold
  • Officers have asked it to match artist sketches rather than photographs
  • It has been used to sidestep standard procedures for photo lineups

As an example of this last point, New York magazine described how the NYPD pulled security footage from a store, ran it through facial recognition software and got a match from its database. Officers then showed the photo to a security guard and asked whether he recognized the man the software had picked out. Of course, the guard said he did, even though the man in question had been in the hospital with his wife as she gave birth to their baby.

How to avoid the worst abuses

The article noted that China has already begun to use public cameras and facial recognition software to score people’s civic behavior. To those of us in the United States, that may sound like dystopian science fiction, but it is already happening there, even if it is unlikely to happen here.

However, there are plenty of other ways the technology is used here, and every new use is a chance for abuse. People have already raised privacy concerns. There have been instances where someone was wrongly accused of a crime after a false match.

At Kammen & Moudy, LLC, our attorneys recognize how new technologies can be misused. When those misuses lead to wrongful arrests or accusations, our clients can rely on our tenacious defense. We question the technologies and the ways they were used, and if we find a reason to doubt their results, we make sure any possible jury will hear about it.
