The Surveillance State Or The Safe State? The Rise Of Live Facial Recognition Technology
On 13 August 2021 the Home Office revised the “Surveillance Camera Code of Practice”. The purpose of the code is to provide guidance to local authorities and the police regarding the “appropriate use” of surveillance camera systems in England and Wales and to ensure such use is lawful.
The Code includes the regulation of live facial recognition technology (LFRT). This is defined by the Information Commissioner’s Office as “the process by which a person can be identified or otherwise recognised from a digital facial image”. Cameras are used to capture images of members of the public and facial recognition software creates a biometric template for each facial image captured. This is then compared against other biometric templates, often stored on a database, to identify a match, thereby verifying a person’s identity.
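The matching step described above can be illustrated with a short, purely hypothetical sketch. Real deployments use vendor-specific neural-network embeddings and operational thresholds; the function names, the toy vectors and the 0.9 threshold here are all assumptions made for illustration only.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two biometric template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_match(probe, watchlist, threshold=0.9):
    """Compare a captured template against a watchlist of stored templates.

    Returns the identifier of the best-scoring watchlist entry, or None if
    no similarity exceeds the (hypothetical) threshold. In the latter case,
    the safeguards discussed below would require the probe data to be
    deleted immediately.
    """
    best_id, best_score = None, threshold
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

In this sketch, a face that closely resembles a stored template returns that entry's identifier, while a face resembling no one on the watchlist returns no match at all.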
The civil liberties organisation Big Brother Watch states that police have used LFRT in South Wales, Leicestershire, Manchester, Sheffield, Hull and Liverpool at shopping centres, protests, music festivals, museums and football stadiums, including at the UEFA Champions League Final in 2017. The Metropolitan Police trialled the technology between 2016 and 2019. Most recently, the Commissioner of the Metropolitan Police was asked to confirm whether LFRT was used at Extinction Rebellion protests last month in London.
In R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058, the Court of Appeal considered South Wales Police’s use of live facial recognition technology to identify members of the public on a “watchlist”. The technology scanned and captured images of the public, comparing their biometric data, without their knowledge or consent, with images of those who were of interest to the police.
The Court ultimately found that South Wales Police’s use of LFRT was unlawful. Future use of live facial recognition will, however, be legally permissible, provided that certain safeguards are met:
- The data of persons who do not match those on the watchlist must be automatically and instantaneously deleted
- Clearer criteria should be given for who can be placed on a police watchlist and where the technology can be deployed
- It must be evidenced that the Public Sector Equality Duty has been fulfilled. A police force must demonstrate that it has rigorously assessed whether a predictive policing model contains inherent biases, for example, whether the technology is more likely to falsely or mistakenly identify people from certain ethnic groups
There is no statutory framework specifically governing the use of live facial recognition technology. The Home Office states that the Code is consistent with recent legislation and case law, and will therefore ensure compliance with legal duties. The use of this technology, however, presents a number of legal issues, for example, who can lawfully be included on police watchlists and how extensively the technology can be deployed. More widely, there are questions surrounding the appropriateness of technology that relies upon a vast amount of sensitive personal data and whether such policing is disproportionately intrusive in modern society.
How does the Surveillance Camera Code of Practice govern the use of LFRT?
The Surveillance Camera Code of Practice sets out a number of considerations for the use of surveillance camera systems generally:
- Any use of surveillance cameras must be for a specified and legitimate purpose and “necessary to meet an identified pressing need”. Legitimate aims are, inter alia, the prevention of crime, national security and public safety
- The operator must consider how people’s rights to privacy will be affected by the intended operation of surveillance
- Members of the public should be notified that surveillance cameras are in operation, the reason the technology is in operation, and who is carrying out the surveillance
- Images captured should not be stored “for longer than is necessary to fulfil the purpose for which they were obtained in the first place”
- There must be accountability for the operation of the surveillance camera system, with persons appointed to ensure compliance with the standards set out in the Code. The use of the technology must also be reviewed periodically to ensure that continued use is lawful
The Code also gives specific guidance to the operation of live facial recognition technology:
- The use of LFRT must be “clearly justified, proportionate in meeting the stated purpose, and be suitably validated”. This may involve undertaking a Data Protection Impact Assessment periodically to ensure that the use of LFRT is justified, any privacy concerns are addressed and necessary safeguards are established
- Where the police have a “watchlist” of people they wish to find, the police should ensure the following:
- The categories of people to be included on the watchlist are published
- The criteria regarding where and when LFRT will be used are published
- The biometric data of individuals who are not a match with those on the watchlist must be instantaneously deleted
- A rigorous assessment of whether LFRT contains inherent bias must be carried out: as outlined, there may be a high risk of false identification on the grounds of race or sex
- Deployments should be authorised by senior officers and there should be clear criteria to the authorisation process
Limitations of the Surveillance Camera Code of Practice
The Code does not give any guidance as to who can be included on police watchlists. In Bridges, South Wales Police included people who were “simply of possible interest for intelligence purposes”. Such discretion was held to be “impermissibly wide”. The Metropolitan Police reportedly included those with “mental ill-health” on their watchlist, when deploying LFRT on Remembrance Sunday in London in 2017. In the absence of clear guidance, it is unclear how police and local authorities can make any meaningful assessment as to whether their use of LFRT will contravene Article 8 rights to privacy.
It is also unclear from the guidance when and where the technology can be deployed. In Bridges, South Wales Police deployed LFRT “in all events ranging from high volume music and sporting events to indoor arenas”. The Court indicated that the locations where LFRT can properly be deployed should be qualified and have an objective basis. Deployment should not be discretional and “without apparent limits”. The Code states that chief police officers should publish the criteria that will be used to determine “when and where to deploy LFR, having regard to the need only to do so for a lawful policing purpose”. The Code does not, however, give any guidance as to the criteria that should be adopted. It is therefore unclear on what basis deployment of LFRT will be permissible in a certain location.
The outlook for Live Facial Recognition Technology
The use of live facial recognition technology is arguably increasing: private companies are using the technology in shopping centres, transport hubs and conference centres. It was widely reported that the company Southern Co-operative has used LFRT in 18 of its supermarkets, and the Metropolitan Police admitted last year to supplying images to a private company to carry out live facial recognition surveillance at King’s Cross.
Hampshire Constabulary, Humberside Police, South Yorkshire Police and North Wales Police are currently trialling retrospective facial recognition technology, which compares video footage or images submitted by the public, for example from social media, with images obtained from custody records. Most recently, a company has incorporated live facial recognition into body-worn cameras to capture live images of the public, although such technology is not currently being used by the police.
Conclusions
Whilst the Home Office and the Information Commissioner, Elizabeth Denham, have conceptualised the debate surrounding LFRT as a “balance” between public protection and individual human rights, 31 civil liberties organisations have called for the use of the technology to be banned altogether. The continuing use of this technology raises a number of concerns: the potential for such technology to operate with racial and gender bias, how extensively and pervasively the technology can be deployed within society, and who can lawfully be “watched” by the police. In sum, neither a clear ethical position on whether such intrusive technologies have a place in modern society, nor the parameters within which they may properly be used, has yet been determined. Despite this, the prevalence of live facial recognition technology is increasing. What is clear is that the Surveillance Camera Code of Practice, as drafted, is unlikely to be sufficient to govern its use.
Our leading team of Civil Liberties & Human Rights solicitors have been at the forefront of challenging unlawful actions by the State for over four decades. We passionately fight for what’s right and have an exceptional track record of holding authorities to account. If you would like to speak to one of our experienced solicitors, please call 0330 822 3451 or request a call online.