Information Commissioner's Office (ICO) issues “damning report” on the use of facial recognition software by police.

On October 31st 2019 the Information Commissioner's Office (ICO) released a report following an investigation into the use of live facial recognition software by police forces in the UK. While the investigation focused on the case of South Wales Police, the report acknowledges the implications it will have for police forces around the country. Specific reference was made to the Metropolitan Police Service.

This investigation comes at a time of heightened public interest in live facial recognition (LFR), particularly in the wake of the King’s Cross Station incident. There has been one court case against a police force so far over the technology’s use, 18 politicians have signed a petition to prevent further implementation, and pilot programmes have been halted around the country.

“Police forces across the country halting facial recognition trials due to public backlash is a huge step backwards and puts innovation at risk,” said biometric authentication firm Veridium’s CRO, Jason Tooley.

While the report did not conclude that regulatory action was necessary in the case, it did highlight some key issues:

“There are areas of data protection compliance where the Metropolitan Police Service and South Wales Police could improve practices, share lessons and reduce inconsistency.”

“There have been missed opportunities to achieve higher standards of compliance and also to improve public awareness and confidence in the technology and its use for law enforcement purposes.”

“Inconsistencies in approach between SWP and the MPS are likely to be repeated in any roll out of LFR across more forces, leading to an increased risk of compliance failure and undermining public confidence.”

“In particular, where this inconsistency relates to the compilation of watchlists and to individual forces’ necessity and proportionality judgements, it is likely to lead to more confusion and deeper public concern and make the law less predictable and foreseeable.”

“The absence of a statutory code of practice and national guidelines contributes to inconsistent practice, increases the risk of compliance failures and undermines confidence in the use of the technology.”

“Data protection legislation has specifically set a high bar for processing biometric data. The Commissioner remains of the view that the more generic the objectives and the watchlist, the more likely it is that the bar will not be met. The MPS deployments were overall more specific than that of SWP.”

“Despite over 50 deployments, in the case of SWP, there is no clear articulation of what the police consider to be ‘effective’ or at what point the piloting phase may end. This could lead to concerns overall about effectiveness and therefore whether the high number of trials over an extended period supports or undermines the necessity and proportionality case for its use.”

“Whilst there is a reduction in the number of false matches since 2017, more needs to be done to reduce technology bias and to describe the steps taken by the police to do so.”

“The investigation did not identify whether staff that were involved in compiling the watchlists had guidance on how to ensure that all the images they used to compile the watchlists were accurate and lawfully retained.”

“DPOs have been too peripheral in the LFR pilots and in some instances have been consulted too late in the process. This leads to concerns about forces’ adherence to data protection accountability principles.”

“Fair processing obligations were broadly met but with room to improve public awareness of the deployment of LFR through better positioned and clearer signage and through use of police forces websites.”