“What is not, what cannot be”
In the 2002 film Minority Report, starring Tom Cruise and directed by Steven Spielberg, based on a story by the ubiquitous science-fiction writer Philip K. Dick, police arrest criminals before they commit their crimes, acting on information from three psychics called Precogs. The story is set in the year 2054.
We are entering 2022.
digital precogs
Artificial intelligence programs were being used in the US to help determine sentences and assess the risk of recidivism. Beyond the serious problem, in my opinion, of such systems keeping their source code closed for alleged reasons of commercial secrecy, racist tendencies were also observed, with errors occurring more frequently against ethnic minorities.
Maybe the program works fine. Maybe not. The fact is that, as the controversies and the number of possible errors mounted, its use was curbed.
The NYPD, on the other hand, was using a computer program to identify crime patterns. In this case there is only data analysis, without prediction, which is arguably more agile, if not strictly more efficient, than human work. Which is quite convenient in criminal investigation.
facial recognition
Another controversy, still in police territory, is the use of facial recognition to identify alleged criminals. Setting aside the bizarre episode in which the NYPD fed actor Woody Harrelson’s photo into its system to identify a beer thief, simply because the suspect looked like him, there are other controversies.
In both the US and the UK, studies have pointed to a racist bias in the algorithms behind facial recognition programs used to identify suspects. Errors occur more often against Black people, and more often still against Black women.
An interesting documentary about these racist and sexist biases in general is Netflix’s Coded Bias, which takes as its starting point the research of MIT scientist Joy Buolamwini.
judge and prosecutor
Artificial intelligence is clearly advancing in every field. And significant steps are being taken to bring it directly into the judiciary and the public prosecutor’s office.
Estonia has been developing a robot judge for the past few years to arbitrate small claims. In Brazil, the STF uses VICTOR, a program designed to expedite the identification of general-repercussion issues in the cases where they arise most often. State courts are also developing their own robots to expedite the screening of certain cases.
Now prosecutors are also being “robotized” in China. The government is developing an artificial intelligence program that, from a verbal description of a case alone, would be able to assess whether it warrants criminal prosecution.
It is clear that we are increasingly in the hands of technology. Whether for good or ill, only the future will tell.