As part of our recent series of articles on facial recognition cameras, we have touched upon the ongoing case brought against South Wales Police by a member of the public over its use of an automated facial recognition (AFR) security camera installation. Earlier this month, the High Court ruled that the use of the AFR was justified – so let’s unpack what went on and ask whether we’ll see such tech used more widely in the near future.
The Case Against The AFR Security Camera Installation.
Whilst out Christmas shopping in Cardiff during December 2017, Ed Bridges was photographed by the AFR unit installed by South Wales Police to test its ability to identify known or wanted offenders.
Noticing the unit, Mr Bridges claimed that his human rights were being breached because the camera was ‘infringing upon his privacy’, and that data protection laws were being breached too. Facial recognition security camera installations have also been criticised for their inadequate performance, particularly when identifying people from black and ethnic minority backgrounds. As such, the case also included a claim of a breach of equality law.
However, the High Court rejected all of the claims and ruled that the police were using the camera lawfully. Its key findings were:
- That South Wales Police’s use of AFR met the requirements of the Human Rights Act and its actions were subject to sufficient legal controls.
- That, although the police were processing personal data, they were doing so in a lawful manner that met the conditions set out in legislation.
- That the force had complied with equality laws.
- That the current legal system is adequate to ensure the proper and non-arbitrary use of AFR.
The Reactions To The Court Ruling For Facial Recognition Cameras.
Mr Bridges has already indicated that he will appeal the ruling; speaking before the case, he said: “South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
After the ruling was made, the human rights organisation Liberty, which supported Mr Bridges during the case, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms. Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.”
However, South Wales Police, whilst recognising that the use of AI and face-matching technologies can be of great concern to the general public, was pleased that the court had recognised the responsibility it had shown in its programme. Chief Constable Matt Jukes said: “There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
Meanwhile, the Information Commissioner’s Office (which intervened in the case) said: “We welcome the court’s finding that the police use of live facial recognition systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018. This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”
What’s The Public Mood on Face Recognition Cameras?
The cautious way that the police use the technology seemingly fits with public opinion. The Ada Lovelace Institute, an independent AI and data research and deliberative body, surveyed more than 4,000 people shortly before the ruling was announced on the acceptability of police using an AFR security camera installation. A majority (55%) said that they wanted restrictions imposed on how the police use facial recognition, while 49% supported its use, assuming that appropriate steps were taken to safeguard the data it captured.
Other key outcomes to emerge from the survey include:
- Most people oppose the use of facial recognition by companies for commercial benefit, as they don’t trust that it will be used ethically. 77% of those surveyed were uncomfortable with it being used to track a customer’s ‘journey’, whilst 76% were uncomfortable with it being used by HR departments in recruitment.
- Whilst people are prepared to accept the police using facial recognition technology when there is a clear public benefit and safeguards are in place, a clear majority are opposed to its use in schools (67%) and on public transport (61%).
With an appeal from Mr Bridges set to be lodged and public opinion still divided over the use of an AFR security camera installation, it may be some time before such tech becomes commonplace in society. In the meantime, there are plenty of ways a business can protect itself, its staff and its visitors by publicly acceptable means.
Looking To Improve The Security of Your Business?
Here at Tellivue, we are one of the leading retail security providers in the UK. Having installed all manner of security camera systems for over 20 years, we have seen technology trends change considerably during that time. By moving with these changes, we are now well placed to offer expert advice on the changing landscape of the industry, as well as on how recent innovations can combat retail crime and further reduce profit loss.
So, if you’re looking to improve your own retail loss prevention methods, why not get in touch with us today to learn more about how we can help? Give our friendly team a call on 020 7846 3300 or send an e-mail to [email protected] and we’ll get back to you as soon as we can.