Despite the general concerns associated with the technology, face recognition has clear merits in many security applications.
Originally it was a "human-in-the-loop" technology with very limited potential. Today, the evolution of this biometric identification technique has paved the way for full automation and near-universal adoption. You encounter it when you unlock an electronic device with your face, cross a border in some countries, or pass through the sensors of a modern employee attendance system.
Experts are busy taking biometric mechanisms to the next level, and they have already achieved some success. The technology first gained traction through Instagram and other social media, and it has since grown into a tool that can help investigators close cases.
It is also a powerful tool for law enforcement to identify criminals. Face recognition is the backbone of China's Skynet mass surveillance system, which reportedly comprises more than 600 million cameras installed across the country.
Even wearing a face mask may not prevent surveillance cameras from identifying you. Researchers have trained these systems to recognize individuals with partially covered faces: modern face recognition systems reach roughly 90% identification accuracy even when only half of the face is visible.
The coronavirus crisis has been the driving force behind this improvement. With more people wearing surgical masks and respirators in public, video surveillance systems have needed major overhauls to address the challenge.
The COVID-19 reality has motivated Chinese tech companies SenseTime and Minivision to deploy face recognition commercially in such scenarios. The new algorithms can not only identify people wearing face masks but also accurately recognize those wearing a scarf, sunglasses, a hat, or even a fake beard.
The court finds some fault with the UK police force’s use of facial recognition tech
Civil rights activists in the UK have won a dispute with South Wales Police (SWP) over its use of face recognition technology. The human rights organization Liberty called it the "first" victory in the world in the fight against this intrusive form of surveillance.
The police, however, have no plans to appeal the ruling, saying they remain committed to the "careful" use of the technology.
SWP has been testing Automated Facial Recognition (AFR) technology since 2017, using watch lists of 400-800 people, including people wanted on warrants, prisoners who have escaped, people suspected of committing a crime, and vulnerable people who may need protection.
The human rights implications of the police's unconditional processing of sensitive personal data are a major issue. The unjustified risks that may arise from automating identification decisions are another.
According to the court ruling, the legal framework and policies relied on by SWP did not give clear guidance on where AFR Locate could be used or who could be placed on a watch list.
According to the court decision, the lower court had properly weighed, under human rights law, whether the police's use of AFR was proportionate: it considered the "actual and anticipated benefits" of AFR Locate against its impact on the claimant, Bridges, and concluded that the benefits potentially outweighed the personal impact.