Cambridge University Society for Women Lawyers

Kindly sponsored by Ashurst

Analysing our Better Angles: Facial Recognition Technology and the Duty of Equality

Written by the winner of the CUSWL Law and Tech Essay Competition 2021, Holli Sargeant



Facial recognition technology has come under microscopic scrutiny, allowing detailed engagement with the legal, ethical and policy challenges posed by new technologies. In the current digital milieu, the convergence of technologies and the scale of their applications make it challenging to analyse the extensive concerns about their design, deployment and use. However, the narrow policy analysis of one type of technology has opened the floodgates to valuable discussions about how we regulate emerging technology. Facial recognition technology’s omnipresence raises concerns about mass surveillance, lack of transparency, and invasions of privacy and personal autonomy.


There are too many applications of facial recognition technology to cover in this article. Prominent examples include China’s social credit system, which demonstrates the technology’s potential use in social control and surveillance. Similarly, law enforcement agencies have deployed facial recognition technology in live public surveillance (including monitoring Black Lives Matter protests and the Washington, D.C. riot), and governments have used it to monitor compliance with social distancing guidelines during the COVID-19 pandemic.


Besides general concerns about using this technology, there are significant legal risks arising from its inaccuracy, which disproportionately affects women and people of colour, and the potential discrimination that follows. Public outcry brought many of these issues to the foreground. Key triggers for advocates included the New York Times investigation into Clearview AI, research from academics including Safiya U. Noble, Ph.D. and Joy Buolamwini, and the lack of transparency from law enforcement, such as the Australian Federal Police’s use of controversial facial recognition technology. As a result, many companies ceased to develop or sell this technology, various governments imposed regulatory interventions such as bans and moratoriums, and cross-jurisdictional investigations into the practices of certain technology providers began.


Among a flurry of academic commentary, government guidance and consultation, and private sector announcements, the Court of Appeal decided the first case to consider the legal status of law enforcement’s use of automated facial recognition technology: R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058. Ed Bridges, alongside Liberty, won this world-first legal challenge. The Court of Appeal held that the South Wales Police’s use of the technology was unlawful because it violated privacy rights and breached data protection and equality laws.


While a landmark case because of its subject matter, it also solidified the requirement that emerging technology be deployed in compliance with existing equality laws and values. One important and interesting principle to arise from this case is the Court’s determination that the South Wales Police did not comply with the Public Sector Equality Duty in section 149 of the Equality Act 2010.


The Court acknowledged public concern about the relationship between the police and BAME communities. This tension underlies and underscores the need for the Public Sector Equality Duty. Public agencies must eliminate discrimination, advance equality of opportunity and foster good relations when considering ‘a new policy… which may turn out in fact to have a disproportionate impact on certain sections of the population’.


The Court found that the South Wales Police had not sought to satisfy themselves, either directly or by independent verification, that the software in this case did not have an unacceptable bias on grounds of, in particular, race or sex. The Court went further to clarify that the software developer’s refusal to divulge the details of its algorithm, on grounds of commercial confidentiality, did not discharge the police’s non-delegable Public Sector Equality Duty.

This point of law is of interest as more public agencies come to use novel and controversial technologies. They will require legal and technical solutions to verify that the relevant software does not exhibit bias based on a protected attribute. Even where organisations are not bound by the Public Sector Equality Duty, this serves as a useful reminder to the private sector of the public concern about algorithmic bias and technologies’ disproportionate impact on certain groups within society.
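To illustrate what such technical verification might involve, the sketch below compares a hypothetical face-matching system’s false match rates across demographic groups. The records, group labels and the four-fifths-style 0.8 disparity threshold are all assumptions for illustration, not details drawn from the Bridges case or any real system.

```python
from collections import defaultdict

# Each record: (demographic_group, ground_truth_match, system_flagged_match).
# These records are illustrative placeholders, not data from the case.
results = [
    ("group_a", False, True),  ("group_a", False, False),
    ("group_a", True,  True),  ("group_a", False, False),
    ("group_b", False, True),  ("group_b", False, True),
    ("group_b", True,  True),  ("group_b", False, False),
]

false_matches = defaultdict(int)  # genuine non-matches wrongly flagged, per group
non_matches = defaultdict(int)    # genuine non-matches seen, per group
for group, truth, flagged in results:
    if not truth:  # only a genuine non-match can become a false match
        non_matches[group] += 1
        if flagged:
            false_matches[group] += 1

# False match rate per group; a materially higher rate for one group is
# the kind of differential accuracy at issue in Bridges.
rates = {g: false_matches[g] / non_matches[g] for g in non_matches}
for group, rate in sorted(rates.items()):
    print(f"{group}: false match rate = {rate:.2f}")

# A simple disparity check, loosely modelled on a "four-fifths"-style
# ratio test; the 0.8 threshold is an assumption for illustration.
best, worst = min(rates.values()), max(rates.values())
if worst > 0 and best / worst < 0.8:
    print("Warning: false match rates differ materially across groups.")
```

In practice, an audit of this kind would use far larger evaluation datasets, established statistical tests, and rates disaggregated across intersecting attributes such as race and sex.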


Considering the unprecedented pace of technological change, it is impractical to adopt regulatory techniques that treat technology itself as the object of regulation. Rather, as shown by the Court’s application of existing legal principles, regulation should focus on the outcomes of technology and should aim for technological neutrality.


Many existing laws apply to emerging technology without unique contortions. However, technology can also be designed to prevent potentially harmful, or insufficiently rights-protective, practices. Inherently, this requires greater interdisciplinary research and cross-pollination between lawyers and technologists.


Technology will not eradicate the vagaries of human judgement or bias. The facial recognition technology used by the South Wales Police provides a perfect example of how it may instead embed them. However, there is ample opportunity to deploy technology with the assurances and design needed to enable stronger rights protections.


Get in touch if you would like to write for our blog!

