Strengthening Legal and Equality Oversight on Police's Automated Facial Recognition: Bridges v South Wales Police
Introduction
The case of R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058 represents a pivotal moment at the intersection of law enforcement technology, privacy rights, and equality obligations in the United Kingdom. Edward Bridges, a civil liberties advocate, challenged South Wales Police's (SWP) use of live Automated Facial Recognition (AFR) technology, specifically the AFR Locate system, on the grounds that its deployment infringed his rights under Article 8 of the European Convention on Human Rights (ECHR), violated data protection law, and breached the Public Sector Equality Duty (PSED) under section 149 of the Equality Act 2010.
This commentary delves into the comprehensive judgment rendered by the England and Wales Court of Appeal, examining the legal reasoning, precedents cited, and the broader implications for future deployments of AFR technology by law enforcement agencies.
Summary of the Judgment
The Court of Appeal addressed five grounds of appeal raised by the Appellant, Mr. Bridges. The pivotal findings are as follows:
- Grounds 1, 3, and 5: The Court allowed the appeal, concluding that SWP's use of AFR Locate was not in accordance with the law under Article 8(2) of the ECHR, that their Data Protection Impact Assessment (DPIA) failed to comply with section 64(3)(b) and (c) of the Data Protection Act 2018 (DPA 2018), and that SWP did not fulfill its obligations under the PSED.
- Grounds 2 and 4: These grounds were dismissed. The Court held that the Divisional Court had properly assessed the proportionality of the interference with Article 8 rights, and that SWP's policy documents, though deficient in places, were due to be revised in light of forthcoming guidance from the Information Commissioner.
Consequently, the Court issued a declaration highlighting that SWP's deployment of AFR Locate was unlawful under the specified grounds.
Analysis
Precedents Cited
The judgment extensively referenced landmark cases that shape the understanding of surveillance technology's legality and its alignment with human rights:
- R (Catt) v Association of Chief Police Officers [2015] UKSC 9: Established principles regarding the "in accordance with the law" standard for Article 8(2).
- Bank Mellat v Her Majesty's Treasury (No 2) [2013] UKSC 39: Outlined the four-part proportionality test for justifying interference with Convention rights.
- R (P) v Secretary of State for Justice and Another (In re Gallagher) [2019] UKSC 3: Further clarified the requirements for lawful interference under Article 8.
- R (Bracking) v Secretary of State for Work and Pensions [2013] EWCA Civ 1345: Provided clarity on the Public Sector Equality Duty.
These cases collectively informed the Court of Appeal's approach to evaluating SWP's AFR deployment, ensuring that the new technology's use was scrutinized within established legal frameworks.
Legal Reasoning
The Court of Appeal's reasoning hinged on three core areas:
- Lawfulness under Article 8(2): The Court evaluated whether SWP's use of AFR Locate was "in accordance with the law." It determined that the existing legal framework, comprising the Data Protection Act 2018, the Surveillance Camera Code of Practice, and SWP's local policies, was insufficient. The discretion granted to SWP in determining watchlist criteria and deployment locations lacked the necessary specificity to prevent arbitrary interference with individuals' privacy rights.
- Data Protection Impact Assessment (DPIA): The Court found that SWP's DPIA failed to adequately assess and mitigate risks associated with processing biometric data, especially concerning individuals not on watchlists. This oversight violated sections 64(3)(b) and (c) of the DPA 2018.
- Public Sector Equality Duty (PSED): SWP's Equality Impact Assessment was deemed inadequate because it did not sufficiently consider the potential for indirect discrimination on grounds of race or sex. The Court emphasized that the PSED is a continuing duty requiring thorough and proactive steps to identify and avoid discrimination, steps which SWP failed to take.
The Court underscored that sensitive processing of biometric data demands stringent safeguards and transparent policies to ensure compliance with legal standards and equality obligations. The arbitrary discretion in SWP's policies posed significant risks of disproportionality and indirect discrimination.
Impact
This judgment has profound implications for the deployment of AFR technology by law enforcement agencies:
- Legal Framework Enhancement: Police forces must establish clear, specific criteria for watchlist inclusion and AFR deployment locations to ensure the legality of such interventions under Article 8.
- DPIA Compliance: Comprehensive DPIAs are mandatory, requiring thorough risk assessments and mitigation strategies, especially when processing sensitive biometric data.
- Equality Obligations: Agencies must rigorously adhere to the PSED, proactively identifying and addressing potential biases and discrimination arising from surveillance technologies.
- Policy Development: There is a pressing need for standardized, detailed policy documents governing AFR use, possibly at a national level, to minimize discretion and enhance accountability.
Future cases involving AFR and similar technologies will likely reference this judgment, reinforcing the necessity for robust legal and equality safeguards in surveillance practices.
Complex Concepts Simplified
To elucidate the nuanced legal concepts addressed in the judgment:
- Article 8 of the ECHR: Protects the right to respect for private and family life, home, and correspondence. Any interference must be lawful, necessary, and proportionate.
- Automated Facial Recognition (AFR) Technology: Uses algorithms to identify individuals by comparing facial features captured via CCTV with those on a watchlist.
- Proportionality Test: A legal mechanism to assess whether the benefits of a measure justify its intrusion into individual rights. It involves evaluating the importance of the objective, the connection between the measure and the objective, the necessity of the measure, and the balance between individual rights and community interests.
- Data Protection Impact Assessment (DPIA): A tool to identify and mitigate risks related to data processing activities, especially when handling sensitive data like biometric information.
- Public Sector Equality Duty (PSED): Mandates public authorities to eliminate discrimination, advance equality of opportunity, and foster good relations between different groups.
These simplified explanations facilitate a clearer understanding of the judgment's significance and its ramifications for legal practices surrounding surveillance technologies.
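To make the AFR matching step above concrete, the following minimal Python sketch compares a probe face "embedding" against a small watchlist using cosine similarity. The vectors, names, and threshold here are purely illustrative assumptions: a production AFR system derives high-dimensional embeddings from a neural network and tunes its match threshold operationally.

```python
from math import sqrt

# Toy face "embeddings": in a real AFR system these would be
# high-dimensional vectors produced by a neural network from CCTV frames.
WATCHLIST = {
    "suspect_A": [0.9, 0.1, 0.3],
    "suspect_B": [0.2, 0.8, 0.5],
}

MATCH_THRESHOLD = 0.95  # illustrative; real deployments tune this value


def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm


def match_against_watchlist(probe):
    """Return (identity, score) of the best watchlist match above threshold, else None."""
    identity, template = max(
        WATCHLIST.items(), key=lambda kv: cosine_similarity(probe, kv[1])
    )
    score = cosine_similarity(probe, template)
    if score >= MATCH_THRESHOLD:
        return identity, score
    # Below threshold: no alert. Note that even transiently captured biometrics
    # of non-matching passers-by engage Article 8, as the Court held.
    return None


# A probe close to suspect_A's template matches; a dissimilar one does not.
print(match_against_watchlist([0.88, 0.12, 0.31]))
print(match_against_watchlist([0.1, 0.1, 0.9]))
```

The threshold illustrates the judgment's equality concern in miniature: if the underlying embeddings are less accurate for some demographic groups, a single global threshold can produce different false-match rates across those groups, which is precisely the kind of risk the PSED required SWP to investigate.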
Conclusion
The Court of Appeal's decision in Bridges v South Wales Police serves as a crucial checkpoint in ensuring that the deployment of advanced surveillance technologies like AFR aligns with fundamental human rights and equality standards. By highlighting deficiencies in SWP's legal framework, DPIA, and equality assessments, the judgment underscores the imperative for law enforcement agencies to adopt meticulous, transparent, and accountable practices when implementing such technologies.
Moving forward, police authorities must enhance their policies, conduct thorough impact assessments, and uphold equality obligations to maintain public trust and comply with legal mandates. This landmark ruling not only identifies the shortcomings in SWP's AFR deployment but also sets a precedent that fortifies the protection of individual rights in the face of evolving technological capabilities.
The broader legal community, policymakers, and technology developers must heed this judgment to foster a balanced approach that leverages the benefits of AFR technology while safeguarding against potential abuses and discriminatory practices.