Wednesday, November 23, 2022 by: Roy Green
(Natural News) Spanish police are treading dangerous ground in their bid to make their jobs easier and faster. They intend to use an automated facial recognition (AFR) technology dubbed the ABIS, or automatic biometric identification system, in their pursuit of criminals.
Never mind that they threaten to trample on the basic rights of citizens in the process.
Never mind that many quarters have opposed the ABIS, which uses artificial intelligence (AI) to identify suspects from a database that contains millions of images of crime suspects and detainees already on file in Spain. (Related: EU proposing legislation to restrict facial recognition tech and “high-risk” artificial intelligence applications.)
Never mind that a controversial case in the United States saw the wrongful arrest of a Michigan resident named Robert Williams based on a hit from AFR software that turned out to be unreliable at identifying people of color.
Inaccurate AFR software leads to wrongful arrest
Based on a false match implicating him in the theft of five watches worth around $4,000, Williams was arrested in the driveway of his house in front of his wife and children and held in a crowded jail for 30 hours.
No wonder the case gained national attention when Williams sued his captors and the entire state police force, which later admitted that its software was inaccurate.
The ABIS already contains over five million images, and those arrested after the system is activated will be added to the database.
Developed by the French military technology company Thales, ABIS uses an algorithm called Cogent. The program compares images stored in the database to those introduced by the police, such as images obtained from a security camera or closed-circuit television. It has been evaluated by the National Institute of Standards and Technology (NIST) in the United States.
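In general terms, systems of this kind perform one-to-many ("1:N") matching: a probe image is reduced to a numeric embedding and compared against every embedding in the database, and candidates scoring above a threshold are returned as possible hits. The sketch below in Python illustrates only that generic idea; it is not Thales' Cogent algorithm, and the cosine-similarity measure, the 0.6 threshold and all the names are assumptions chosen purely for illustration.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_database(probe, database, threshold=0.6):
    # Compare the probe embedding against every stored embedding and return
    # the identifiers that score above the threshold, best match first.
    # Real systems use learned face embeddings and indexed search; this
    # brute-force loop only illustrates the one-to-many matching concept.
    hits = [(person_id, cosine_similarity(probe, emb))
            for person_id, emb in database.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

# Toy data: random 128-dimensional vectors standing in for real face templates.
rng = np.random.default_rng(0)
db = {f"suspect_{i}": rng.normal(size=128) for i in range(5)}
probe = db["suspect_3"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(search_database(probe, db))  # only "suspect_3" should clear the threshold

The threshold is where the policy questions live: set it low and the system returns more false hits of the kind that ensnared Williams; set it high and it misses true matches.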
To ward off allegations that the system intrudes on the lives of ordinary citizens and is prone to abuse, the Spanish police said they will not use it for surveillance and will exclude images from civil databases, including those that can be accessed through official identity documents like the national identity document, driver’s license or passport.
The Spanish police added that they intend to share their data with other European Union (EU) member states under the Schengen Information System (SIS) to help in solving international crimes like human trafficking and terrorism.
“The Spanish ABIS system can connect with European databases, such as Eurodac, EU-Lisa or VIS, since the corresponding links are designed. It is not an isolated system, but rather it is interconnected with the countries of the European Union,” the Thales company explained.
Human rights group says AFR is discriminatory and oppressive by design
Other EU members already using biometric recognition systems are Austria, Finland, France, Germany, Greece, Hungary, Italy, Latvia, Lithuania, Slovenia and the Netherlands. Just like Spain, Croatia, Cyprus, Czechia, Estonia, Portugal, Romania and Sweden are expected to follow suit.
EL PAÍS, a widely circulated daily newspaper based in Madrid, reported that the Spanish Agency for Data Protection (AEPD) has contacted Spain’s Ministry of the Interior “to address various projects that could have an impact on data protection.”
The AEPD wants to determine the risks the system poses to the rights and freedoms of citizens. Among the specific concerns raised were “how long the police will keep the images of suspects and who should have access to the data.”
AFR algorithms aren’t perfect yet. They make mistakes, as the Williams case proved. EU states like Belgium consider the technology “high risk,” even though Belgium has approved the use of AFR for “the purposes of preventing, arresting or investigating serious crimes or terrorism.”
The Liberty Human Rights group in the U.K. wants to ban the use of AFR altogether, arguing that it is discriminatory, invades privacy, undermines freedom of expression and is oppressive by design.
According to Carmela Troncoso, a professor at the Federal Polytechnic School of Lausanne in Switzerland, there are questions regarding facial recognition technology that must be answered. “NIST does not say that algorithms are good or bad. And in addition, the organization proposes several evaluations with different objectives, and we do not know which ones they refer to,” she said.
Eticas Consulting, a company that specializes in auditing algorithms, echoes Troncoso’s views: “In accordance with European regulations, the proportionality of high-risk technologies must be justified and what is expected to be achieved with them must be established. It is also necessary to know what precautions have been taken to avoid algorithmic biases. It is proven that these systems identify white people better than the rest, so you have to prove that they do not make mistakes with blacks.”
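One way auditors of the kind Eticas describes can quantify that concern is to compare false match rates across demographic groups: if the system wrongly matches different people far more often in one group than in another, the bias is measurable. The figures and group labels below are purely hypothetical, a minimal sketch of the metric rather than any real audit result:

# Each trial records: (demographic_group, system_declared_match, truly_same_person)
trials = [
    ("group_a", True, True),  ("group_a", True, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
]

def false_match_rate(trials, group):
    # Fraction of different-person comparisons the system wrongly called a match.
    impostors = [declared for g, declared, same in trials
                 if g == group and not same]
    return sum(impostors) / len(impostors) if impostors else None

for group in ("group_a", "group_b"):
    print(group, false_match_rate(trials, group))
# A large gap between the two rates is the kind of disparity regulators
# would ask a vendor to rule out before deployment.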