Biometrics and CCTV Commissioner’s Response to College of Policing Live Facial Recognition APP


Whether it’s on our streets, in our supermarkets or (God forbid) in our schools, how to handle live facial recognition (LFR) is the surveillance issue that won’t go away.

So I was pleased to see the College of Policing’s Authorised Professional Practice (APP) on Live Facial Recognition, which sets out a commitment to the “lawful and ethical” use of this technology. Being guided by legal and ethical considerations will be essential if we are to face, for example, the horrifying prospect of state-owned surveillance companies supplying our police and schools with the same facial recognition technology they use to perpetuate genocide and human rights atrocities in other parts of the world.

I do, however, have some concerns and questions about the published APP. For example:

  1. The apparent intention to use LFR technology to find “potential witnesses” is not the digital equivalent of placing a triangular sign in the street asking passers-by whether they saw anything on a given date that they would like to share with the police. Typically, a police witness is someone who has indicated their willingness to participate in the criminal justice process – in which case you don’t need a camera to identify them; you already know who they are (and, if you don’t, why would you have a ‘library’ picture of them to compare against a crowd when looking for them?). If this contemplates tracking people down and approaching them to confirm whether they were at a certain place on a certain date, then ‘prompting’ them to divulge what they saw and heard simply because a surveillance system thinks they were present, that is a new and somewhat sinister development which potentially treats everyone like extras on a police film set rather than individual citizens free to travel, meet and talk. I think the speculative use of LFR in this way would call into question its legitimacy and proportionality. I can see that there may be exceptional and gravely harmful events, such as terrorist attacks or natural disasters, where retrospective facial recognition could legitimately make a significant contribution to understanding what happened, but such events are mercifully rare and quite exceptional. Providing sensibly for exceptional events requires very careful wording, so that the exception to the rule does not become a catch-all clause covering every unspecified eventuality.

  2. The terminology and definitions of the different types of biometric and forensic searching raise further questions. For example, LFR and Retrospective Facial Recognition invite questions about the relevant training, certification and accreditation standards. What is the essential difference between an LFR search, a mass screening and a forensic database search? Will these need to be clarified with the new Forensic Science Regulator? This goes beyond a glossary and is important for the public’s understanding of the APP and its wider implications.

  3. Representative testing methodologies, e.g. the “Blue Watchlist”. A major and persistent challenge for British policing is that ethnic minority communities remain under-represented within it, in light of which using existing staff to test LFR systems already risks introducing an imbalance and an increased risk of demographic differentials, not only in the software development but also in the human adjudication process.

  4. LFR and counter-terrorism – although not specifically mentioned, the alignment between LFR and the principles and standards set out in the UN Compendium needs to be clarified. Jean Charles de Menezes was tragically shot dead by counter-terrorism police in London because he was misidentified by a surveillance officer. If we were to rely on LFR in such extreme circumstances in the future, what safeguards would apply? Is there a case for judicial authorisation of LFR deployments, rather than approval by a senior police officer, as there is for other types of surveillance? What about the sharing of LFR images and templates between jurisdictions, for example where the technology is used for journeys through the Channel Tunnel? Perhaps the DCMS consultation on the oversight and regulatory framework for biometric surveillance should address this issue.

  5. The APP focuses on data rights, yet the broader context of policing, coupled with acute public sensitivity to certain technologies, goes well beyond data protection. Rather than treating this area simply as a matter of respecting “data rights”, a framework for maintaining public trust in policing should focus more on the much wider impact on society. For example, the “chilling effect” of biometric surveillance by the police has been well documented, both in academic research and in the courts – if people decide not to travel, not to meet, not even to speak openly because they fear that where they go and what they do and say will be monitored by the police, that is a fundamental constitutional consequence of intrusive police activity, and it has nothing to do with data protection. Perhaps the DCMS consultation should address this too.

In summary – in moving from the standard policing model of humans looking for other humans in a crowd to the automated, industrialised process of LFR (as some have characterised it, a shift from angling to trawling on the high seas), how commonplace will it become to be stopped in our cities, transport hubs, outdoor arenas or schoolyards and required to prove our identity? The future ramifications for our constitutional freedoms are profound. Is the status of the British citizen shifting from our jealously guarded presumption of innocence to that of “suspect until we have proved our identity to the satisfaction of the examining officer”? If so, that will require more than an APP from the College of Policing: it will require parliamentary debate.

I want to continue an open and informed dialogue with stakeholders who have an interest in this area, from avid supporters to anti-surveillance activists and everyone in between. The role of technology in surveillance demands a balance, not only between what is possible and what is lawful, but increasingly also with what we find acceptable or even tolerable. Societal acceptability is the ground on which the responsible, ethical and legitimate use of surveillance technology is shaped. Again, that is surely a matter for Parliament.

To achieve a better understanding of the societal acceptability of police use of facial recognition technology, my office plans to put “facial recognition to the test”. In collaboration with Professor William Webster (Centre for Information, Surveillance and Privacy Research), the event will contribute to a key objective of the Civil Engagement strand of the National Surveillance Camera Strategy. It will be held before a live audience and will mimic a trial, with evidence given by expert witnesses and members of the public acting as the jury. The mock trial will take place on 14 June at the London School of Economics, with tickets available to book soon.

My website will continue to be updated as new details emerge.
