The ability of police officers to recognize and locate people who have committed crimes in the past is vital to their work. It is so vital that the police consider it fundamental to maintaining order on the streets, preventing crime and supporting investigations. However, since 2010 the number of police officers in Britain has fallen by almost 20% while recorded crime has been rising, so the police are turning to new technological solutions that should help them keep track of persons of interest.
One such technology is automated facial recognition (AFR). The technology analyzes the main features of a face, creates a mathematical representation of them, and then compares that representation with a database of known individuals to find possible matches. And although many police forces in Britain and elsewhere are enthusiastically exploring the potential of AFR, some citizen groups question the legality and ethics of the technology. They are concerned that it greatly expands the reach and depth of state surveillance of citizens.
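As a rough illustration of that matching step (a minimal sketch, not the actual system used in South Wales, whose internals are not public), facial recognition pipelines typically encode a face as a numeric vector, often called an embedding, and compare it against stored vectors. The function and variable names below are ours.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_matches(probe: np.ndarray, database: dict, threshold: float = 0.6, top_k: int = 3):
    """Return the database entries most similar to the probe embedding,
    keeping only those above the threshold, best first."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    scored = [(name, s) for name, s in scored if s >= threshold]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]

# Toy example with random 128-dimensional "embeddings" standing in for real ones.
rng = np.random.default_rng(0)
db = {f"person_{i}": rng.normal(size=128) for i in range(500)}
probe = db["person_42"] + rng.normal(scale=0.1, size=128)  # a noisy view of a known face
print(best_matches(probe, db))
```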
To date, there is little reliable evidence of what AFR can and cannot deliver for the police. Although more and more people encounter such systems, for example at airport passport checks, their use there is tightly controlled. Applying the same approach to policing the streets is much harder. People on the street are moving and do not look at the cameras, lighting conditions change, and the system has to cope with the vagaries of the British weather.
AFR in the real world
To understand how British police are using current AFR technology, last year we set out to evaluate a South Wales Police project designed to test the usefulness of AFR in the everyday situations the police deal with. Starting with the 2017 UEFA Champions League final held in Cardiff, our team observed how the police used the technology and analyzed the data produced by the system. We wanted to understand how officers interact with the system, what it enables them to do, and what difficulties arise in using it.
South Wales police officers used AFR in two modes. Locate mode used live video from cameras mounted on police vans to compare the faces of passers-by against a watchlist of persons of interest, which usually contained 600-800 photographs.
The other mode, Identify, works differently: images of unknown persons captured at crime scenes are compared against a database of custody photographs of people who have been arrested, which contains approximately 450,000 images.
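Conceptually, both modes rest on the same matching primitive applied to different databases: Locate scans every face in a live frame against a small watchlist, while Identify ranks a single crime-scene image against the much larger custody database. The sketch below is only illustrative; it assumes a generic embedding-based matcher, and the names and threshold values are ours rather than the system's.

```python
import numpy as np

def similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def locate(frame_embeddings, watchlist, threshold=0.7):
    """Locate mode sketch: compare every face detected in a live frame against a
    small watchlist (600-800 photos in the trial) and raise an alert per hit."""
    alerts = []
    for face in frame_embeddings:
        for name, reference in watchlist.items():
            score = similarity(face, reference)
            if score >= threshold:
                alerts.append((name, score))
    return alerts

def identify(probe, custody_db, top_k=10):
    """Identify mode sketch: rank one crime-scene image against a large custody
    database (around 450,000 images in the trial) for an operator to review."""
    ranked = sorted(((name, similarity(probe, reference)) for name, reference in custody_db.items()),
                    key=lambda item: item[1], reverse=True)
    return ranked[:top_k]
```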
Our assessment of the system concluded that AFR helps the police identify suspects in cases where it would not have been possible by other means. Over the 12-month evaluation period, around 100 arrests and charges were made with the help of AFR.
But the system does not work automatically. The police had to adapt many standard operating procedures in order to work effectively with it. For example, once the quality of photographs was found to have a significant effect on the system's performance, police training was updated so that future photographs would be better suited to it.
An assistive tool
It took time for the police to learn how to configure and use the system. During the trial the system's algorithm was also updated to a more sophisticated version, and this upgrade had a marked effect on performance. With the original version, deployed during the Champions League final, only 3% of the matches proposed by the system were accurate; by March 2018 that figure had risen to 46%.
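To make those percentages concrete: if "accurate" is read as the share of system-generated alerts that operators confirmed as true matches, the figure is a simple ratio. The counts below are invented purely to reproduce the reported rates.

```python
def true_match_rate(confirmed_matches: int, total_alerts: int) -> float:
    """Share of system alerts that operators confirmed as genuine matches."""
    return confirmed_matches / total_alerts

# Hypothetical counts chosen only to illustrate the reported percentages.
print(f"{true_match_rate(3, 100):.0%}")   # 3%, the original algorithm
print(f"{true_match_rate(46, 100):.0%}")  # 46%, after the updated algorithm
```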
Innovative policing technologies of this kind raise legal and ethical concerns that need to be considered. But for citizens, regulators and legislators to debate and evaluate them in an informed way, we need to understand exactly what results can realistically be expected from the technology. We need real evidence, rather than references to fantastical technologies like the one used in the film "Minority Report".
Considering all of the above, one conclusion that can be drawn about police use of AFR is that it would be more accurate to call the system "assisted facial recognition", since it is not fully automatic. Unlike border control, where facial recognition comes closer to being automatic, this algorithm supports the police but does not make an independent decision about whether an image of a person matches one stored in the database. Instead, the system presents the operator with possible matches, and only a human operator can confirm or reject them.
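A minimal sketch of what "assisted" means in practice, assuming a generic candidate-list interface (the names and structure here are ours, not the system's): the algorithm only proposes ranked candidates, and no identification exists until a human operator confirms one.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Candidate:
    identity: str
    score: float

def assisted_recognition(candidates: list,
                         operator_confirms: Callable[[Candidate], bool]) -> Optional[str]:
    """The system only proposes ranked candidates; a human operator must confirm
    (or reject) each one before any identification is made."""
    for candidate in sorted(candidates, key=lambda c: c.score, reverse=True):
        if operator_confirms(candidate):
            return candidate.identity
    return None  # no confirmed match: the system makes no decision on its own

# Example: the callback stands in for an operator comparing images side by side.
result = assisted_recognition(
    [Candidate("suspect_A", 0.81), Candidate("suspect_B", 0.64)],
    operator_confirms=lambda c: c.identity == "suspect_A",
)
print(result)  # suspect_A
```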