How A.I. is Affecting the Criminal Justice System

Artificial intelligence in the form of facial recognition is becoming further embedded in society as technology advances. Apple’s iPhone X uses facial recognition to unlock the device, shopping centers use it to target advertisements, and Facebook has long used it to identify users’ faces in photos. Technology companies like IBM and Microsoft take artificial intelligence a step further with emotion recognition algorithms, which aim to identify people’s emotions based on their movements and facial expressions. 

While the use of artificial intelligence by these companies may seem harmless, law enforcement and the criminal justice system have also stepped up their use of the technology. What’s particularly unsettling is how inaccurately and unfairly courtrooms across the country are employing it, raising serious concerns about privacy and bias.

How Does the Criminal Justice System Use A.I.?

Courtrooms across the country are beginning to implement algorithm-based “risk assessments” to identify individuals at high risk of being re-arrested or failing to appear for their court dates. Studies have shown that these algorithms tend to be unreliable because they are built on flawed data.

Using surveillance cameras, police departments can now scan and record a passerby’s faceprint in real time. This information goes into a database that stores hundreds of thousands of citizens’ facial features. While the technology is increasingly being used, faceprints can’t match the accuracy of traditional fingerprints when it comes to identifying wanted criminals.

Federal entities, even more than local law enforcement, are using A.I. through facial recognition now more than ever. The Government Accountability Office found that over the course of four years, the FBI ran more than 118,000 searches using its facial recognition database. U.S. Immigration and Customs Enforcement has also stepped up its use of facial recognition: to locate undocumented immigrants, the agency has begun mining DMV databases in many parts of the country to see whether driver’s licenses were issued to immigrants who may be in the country illegally.

How Does Facial Recognition Work?

Facial recognition technology uses computers to map the dimensions of your face and the distances between its features – something called a faceprint. It measures, for example, the distance between your chin and lips and the gap between your eyes. The technology then matches those features against images of you that already exist in a giant database. The largest public database of facial features is Microsoft Celeb, which houses more than 10 million images of almost 110,000 people from all over the world. Many of these images are collected from social media: when people upload photos of themselves, they may be consenting to their photos’ public release, whether they know it or not.
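At a high level, comparing a faceprint against a database amounts to a nearest-neighbor search over measurement vectors. The toy sketch below illustrates that idea only – the measurements, names, values, and threshold are all hypothetical, and real systems use far more sophisticated models than this.

```python
import math

# Toy "faceprints": each is a short vector of hypothetical, normalized facial
# measurements (e.g., eye spacing, chin-to-lip distance). Illustrative only.
database = {
    "person_a": [0.42, 0.31, 0.58],
    "person_b": [0.40, 0.35, 0.61],
    "person_c": [0.75, 0.22, 0.49],
}

def euclidean(p, q):
    """Distance between two faceprints: smaller means more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def best_match(probe, db, threshold=0.1):
    """Return the closest stored identity, or None if nothing is close enough."""
    name, dist = min(
        ((n, euclidean(probe, v)) for n, v in db.items()),
        key=lambda item: item[1],
    )
    return name if dist <= threshold else None

# A probe faceprint very close to person_a's stored measurements:
print(best_match([0.41, 0.32, 0.59], database))  # prints "person_a"
```

The `threshold` is the crux: set it too loosely and the system produces false matches of the kind described below; set it too strictly and it fails to match anyone.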

How Accurate is Facial Recognition?

At this point, facial recognition is frequently inaccurate. For example, Amazon created software that misidentified 28 members of Congress, matching them to criminal mugshots. The software also did a poor job of correctly identifying women and people of color. While many assume that machines are unbiased, it’s important to remember that people program the computers. People have biases, and when they program a machine, the machine adopts those biases as well.

What Privacy Rights Can A.I. Violate?

The Fourth Amendment to the Constitution protects U.S. citizens against unreasonable searches and seizures. With so many government agencies and police departments nationwide using artificial intelligence through facial recognition, the question arises whether this practice violates our privacy rights. Because the technology is still relatively new, there’s no national standard on how far companies and agencies can go in using A.I. Some states, like Illinois and Washington, have passed laws limiting private companies’ use of facial recognition, but at this point there’s no limit on how federal entities can use A.I.

Contact The Umansky Law Firm if Your Right to Privacy Has Been Violated  

As U.S. citizens, we have a constitutional right to privacy. If you believe your privacy rights were violated to gather evidence against you, it’s essential that you contact an experienced criminal defense attorney as soon as possible. 

The Umansky Law Firm emphasizes the importance of knowing your rights, and we’ll aggressively fight for those whose rights have been violated. 

Our criminal lawyers have more than 100 years of combined experience in criminal law, and we can defend you against illegal law enforcement practices, such as unlawful search and seizure. Call our office or complete an online contact form for a free consultation.
