R. (on the application of Bridges) v Chief Constable of South Wales
Negative Judicial Consideration
Court
Divisional Court
Judgment Date
4 September 2019
Where Reported
[2019] EWHC 2341 (Admin)
[2020] 1 W.L.R. 672
[2020] 1 All E.R. 864
[2019] 9 WLUK 9
[2020] 1 Cr. App. R. 3
[2019] H.R.L.R. 16
[2019] A.C.D. 122
Times, December 09, 2019
Times, December 11, 2019
[2019] C.L.Y. 1389
Subject
Human rights
Other related subjects
Police; Information technology; Criminal evidence
Keywords
Crime prevention; Data protection; Facial recognition software; Personal data; Privacy; Public sector equality duty; Right to respect for private and family life; Visual identification
Judge
Haddon-Cave LJ; Swift J
Counsel
For the claimant: Dan Squires QC, Aidan Wills.
For the defendant: Jeremy Johnson QC.
For the interested party: Richard O'Brien.
For the first intervener: Gerry Facenna QC, Eric Metcalfe.
For the second intervener: Andrew Sharland QC.
Solicitor
For the claimant: Liberty.
For the defendant: In-house solicitor.
For the interested party: Government Legal Department.
For the first intervener: In-house solicitor.
For the second intervener: Government Legal Department.
Case Digest
Summary
The current legal regime in the UK was adequate to ensure the appropriate and non-arbitrary use of automated facial recognition
technology. A police force's use of such technology in a pilot scheme was consistent with the requirements of human rights and
data protection legislation and the public-sector equality duty.
Abstract
A civil liberties campaigner applied for judicial review of the defendant police force's use of automated facial recognition
technology (AFR).
Using AFR involved processing the facial biometric data of members of the public. The technology had been used by the police
force in a pilot project which involved deployment of surveillance cameras to capture digital images of people, which were then
processed and compared with images of persons on police watchlists. If no match was made, the facial biometrics or images of
scanned persons were immediately and automatically deleted. The instant proceedings concerned the adequacy of the current
legal framework in relation to AFR. The claimant challenged the lawfulness of its use generally, and specifically regarding two
occasions when his image had been captured without warning.
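
For readers unfamiliar with how such a system operates, the deployment described above reduces to a simple loop: extract a biometric template from each captured face, compare it against the watchlist, and discard the data immediately if no match is found. The following minimal Python sketch illustrates that logic only; the names used (similarity, process_capture, MATCH_THRESHOLD) are hypothetical and do not describe the system actually deployed by the police force.

from __future__ import annotations
from dataclasses import dataclass

MATCH_THRESHOLD = 0.8  # hypothetical similarity cut-off

@dataclass
class WatchlistEntry:
    person_id: str
    template: list[float]  # stored biometric template

def similarity(a: list[float], b: list[float]) -> float:
    # Hypothetical score in [0, 1]: 1 minus mean absolute difference.
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return max(0.0, 1.0 - diff)

def process_capture(template: list[float],
                    watchlist: list[WatchlistEntry]) -> str | None:
    # Compare the captured biometric template against every watchlist entry.
    for entry in watchlist:
        if similarity(template, entry.template) >= MATCH_THRESHOLD:
            return entry.person_id  # possible match, flagged for human review
    # No match: the biometric data is immediately and automatically
    # deleted, as the judgment describes.
    del template
    return None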
The claimant contended that the use of AFR interfered with his ECHR art.8(1) rights, was contrary to the requirements of data
protection legislation and failed to comply with the public-sector equality duty under the Equality Act 2010 s.149(1).
Held
Application refused.
ECHR art.8 rights - The claimant's physical proximity to the location of the cameras was sufficient to give rise to a reasonable
apprehension that his image might have been captured and processed such as to entitle him to claim a violation of his art.8(1)
rights, AXA General Insurance Ltd, Petitioners [2011] UKSC 46, [2012] 1 A.C. 868, [2011] 10 WLUK 287 followed. The
use of AFR did entail infringement of the art.8(1) rights of those in the claimant's position (see paras 45-62 of judgment). His
primary argument was that use of AFR was not "in accordance with the law" for the purposes of art.8(2). The police's common
law powers were "amply sufficient" in relation to the use of AFR; they did not need new express statutory powers, R. (on the
application of Catt) v Association of Chief Police Officers of England, Wales and Northern Ireland [2015] UKSC 9, [2015] A.C.
1065, [2015] 3 WLUK 106 followed (paras 68-78). There was also a clear and sufficient legal framework governing whether,
when and how it could be used, R. (on the application of Gillan) v Commissioner of Police of the Metropolis [2006] UKHL
12, [2006] 2 A.C. 307, [2006] 3 WLUK 246 followed. It was important to focus on the substance of the actions that use of
AFR entailed, not simply that it involved deployment by the police of an emerging technology. The fact that a technology was
new did not mean that it was outside the scope of existing regulation, or that it was always necessary to create a bespoke legal
framework for it. The legal framework within which AFR operated comprised three elements, in addition to the common law:
primary legislation, in the form of the data protection legislation; secondary legislative instruments in the form of codes of
practice issued under primary legislation, including the Surveillance Camera Code; and the police force's own local policies.
Each element provided legally enforceable standards. When considered collectively against the backdrop of the common law,
the use of AFR was sufficiently foreseeable and accessible for the purpose of the "in accordance with the law" standard. It
was neither necessary nor practical for legislation to define the precise circumstances in which AFR might be used. The data
protection principles provided sufficient regulatory control to avoid arbitrary interferences with art.8 rights. With regard to the
content of local policies, AFR was still in a trial period. The possibility, or likelihood, of improvement was not evidence of
present deficiency (paras 79-97). The use of AFR relied on by the claimant struck a fair balance and was not disproportionate;
it was deployed in an open and transparent way, with significant public engagement, Bank Mellat v HM Treasury [2013] UKSC
39, [2014] A.C. 700, [2013] 6 WLUK 527 followed (paras 98-108).
Data protection - The primary point of dispute in respect of the Data Protection Act 1998 s.4(4) was the extent to which using
AFR entailed processing personal data. The police argued that the only personal data processed was the data of those persons
on their watchlist. There were two routes by which the data could be considered "personal data": indirect identification and
individuation. However, the processing of the claimant's image was processing of his personal data only by the second route. The
information recorded individuated a person, singling them out and distinguishing them from others. The biometric facial data was
qualitatively different and clearly comprised personal data because, per se, it permitted immediate identification of a person. The
police were therefore required to process that data consistently with the data protection principles. The claimant's case rested
only on the first principle that personal data had to be processed lawfully and fairly. The use of AFR on the occasions relied
on satisfied that condition. It was necessary for the police's legitimate interests, taking account of the common law obligation
to prevent and detect crime (paras 109-127). Under the Data Protection Act 2018 s.34(3), competent authorities had to be able
to demonstrate compliance with Pt 3 Ch.2 of the Act. The claimant argued that AFR use entailed "sensitive processing" under
s.35(8)(b) and did not comply with the requirement of the first data protection principle under s.35(3). The police force agreed
in respect of those persons who were on a watchlist, but disputed that it extended to all those captured by CCTV cameras.
However, s.35(8)(b) could properly be read as applying to both categories of person. The operation of AFR did therefore involve
the sensitive processing of the biometric data of members of the public not on the watchlist. However, it also satisfied the first
two requirements in s.35(5), being necessary for law enforcement purposes and meeting at least one of the conditions in
Sch.8 (paras 128-137). The impact assessment prepared by the police met the requirements of s.64 of the 2018 Act. There was
a clear narrative that explained the proposed processing and referred to concerns raised in respect of intrusions into privacy
of members of the public. Although not part of the s.64 requirements, the assessment specifically considered the potential for
breach of art.8 rights. The claimant's criticism on that point was therefore without foundation (paras 142-148).
Public-sector equality duty - The police force relied on a document entitled Equality Impact Assessment - Initial Assessment as
evidence of its compliance with the s.149(1) duty. The claimant's criticism was that the police had not considered the possibility
that AFR might produce results that were indirectly discriminatory on grounds of sex and/or race because it produced a higher
rate of false positive matches for female and/or black and minority ethnic faces. However, there was an air of unreality about that
contention. There was no suggestion that, when the AFR trial started, the police force had, or should have, recognised that the
software might operate in an indirectly discriminatory way. There was still no firm evidence suggesting indirect discrimination.
The possibility of future investigation did not make good the argument that, to date, the police force had failed to comply with
its duty. The police force had continued to review events against the relevant criteria. That was the approach required by the
public-sector equality duty in the context of a trial process (paras 149-158).
© 2023 Thomson Reuters.