Trusted Data Privacy Partners
Data Privacy Risk and Facial Recognition Technology
The Guardian recently reported that a group of 50 Members of Parliament and peers has sent a letter to Frasers Group, the Mike Ashley-led corporate owner of retail brands such as House of Fraser and Sports Direct. The letter strongly criticises the deployment of "live facial recognition" cameras in the group's stores, warning that it risks violating personal privacy and misidentifying shoppers.
The cross-party coalition of MPs, including David Davis (Conservative), John McDonnell (Labour), Tim Farron (Liberal Democrat), and Caroline Lucas (Green), has labelled the technology "intrusive and biased" and strongly urged Frasers Group to halt the rollout of these cameras across the country. The privacy groups Big Brother Watch, Liberty and Privacy International also co-signed the letter, arguing that, in addition to being wrong in principle, facial recognition technology is "inaccurate and ineffective": to date, 87% of alerts generated by the Metropolitan Police's own live facial recognition system have been incorrect, and the inaccuracy of LFR technology disproportionately affects persons of colour and women.
According to the Daily Mail, at least 27 stores, including 13 Flannels, 12 Sports Direct, and two USC stores, have been equipped with biometric facial recognition cameras, the so-called ‘Facewatch’ system. As of May 30th, Frasers Group had not released an official public statement in response to the letter; however, a spokesperson for the group said that surveillance is carried out to "ensure the safety of our employees and to help prevent theft." Facewatch CEO Nick Fisher has backed this position, saying the system prevents thousands of crimes a month.
The use of this technology prompted the Information Commissioner's Office (ICO) to issue a cautionary statement. According to the ICO, live facial recognition risks treating people unfairly because it involves the rapid, large-scale collection of biometric data, often without adequate explanation. Furthermore, the system does not allow those captured to opt out or control how their personal information is used. The ICO therefore emphasised that where live facial recognition is used to collect biometric data indiscriminately in public places, a high legal threshold must be met before its use can be considered permissible.
To learn more about the use of live facial recognition in public places, you can click this link to read the ICO’s Opinion on the technology.
The ICO has announced, in a provisional notice of intent, that it may fine TikTok £27 million after the Commissioner’s investigation provisionally found that the company may have breached UK data protection law by failing to protect children’s privacy when they used the platform. The notice sets out that the suspected breaches took place between May 2018 and July 2020. The findings are provisional, and no conclusion has been drawn at this stage.
Click here to read the official press release from the ICO and click here to read the news coverage from TechCrunch related to the issue.
On September 15th, Ireland’s Data Protection Commission (DPC) announced the conclusion of its inquiry into Meta Platforms Ireland Ltd, imposing a €405 million (£360 million) fine and a range of corrective measures for violations of children’s privacy on Instagram. The DPC opened its investigation after receiving information in September 2020 from a US data scientist, David Stier, showing how children’s data was processed on the Instagram social networking platform.
After months of comprehensive investigation and administrative process, the decision records findings of several violations of Articles 5(1)(a), 5(1)(c), 6(1), 12(1), 24, 25(1), 25(2), and 35(1) of the GDPR.
Meta Platforms Ireland Limited has indicated that it will appeal the fine and disagrees with how it was calculated, as reported in the Irish Times. Meta said that it engaged fully with the regulator throughout the investigation and that the default settings in question have since been changed.
Click here to read the complete press release from the Irish DPC.
The European Parliament has taken a significant step towards ensuring the human-centric and ethical development of Artificial Intelligence (AI) in Europe, as the AI Act passed its first hurdle. Members of the European Parliament (MEPs) have now endorsed new rules for AI systems. The proposed legislation promotes human oversight, safety, non-discrimination, and environmental friendliness; prohibited practices include manipulative techniques and social scoring. The list of high-risk areas has been expanded to include health, safety, fundamental rights and the environment, as well as political campaigns.
MEPs also substantially amended the list of prohibited practices to include bans on intrusive and discriminatory uses of AI systems, such as:
1) “Real-time” remote biometric identification systems in publicly accessible spaces;
2) “Post” remote biometric identification systems, with the only exception of law enforcement for the prosecution of serious crimes and only after judicial authorization;
3) Biometric categorisation systems using sensitive characteristics (e.g. gender, race, ethnicity, citizenship status, religion, political orientation);
4) Predictive policing systems (based on profiling, location or past criminal behaviour);
5) Emotion recognition systems in law enforcement, border management, workplace, and educational institutions; and
6) Indiscriminate scraping of biometric data from social media or CCTV footage to create facial recognition databases (violating human rights and right to privacy).
Click here to read the official press release from the European Parliament, and you can read the related news coverage from Euractiv and Forbes here and here.
The ICO has filed criminal charges against eight people for allegedly conspiring to unlawfully access and obtain people’s personal information from vehicle repair garages in order to generate potential leads for personal injury insurance claims. The activity took place across the UK between December 1st, 2014, and November 20th, 2017, and the personal data of hundreds of thousands of individuals involved in road traffic accidents was collected without their consent.
The eight defendants will now face two charges: conspiring to commit an offence under section 1 of the Computer Misuse Act 1990 (alleged unlawful access to personal data held on a computer) and under section 55 of the Data Protection Act 1998 (alleged unlawful obtaining of personal data). The first hearing in the case will take place at Manchester and Salford Magistrates’ Court on October 27th, 2022.
For more details related to the news, please click here.
The UK Information Commissioner’s Office (ICO) has published its draft guidance on Privacy-Enhancing Technologies (PETs). The ICO describes PETs as technologies that embody fundamental data protection principles by minimising personal data use, maximising data security, and/or empowering individuals. Linked to the concept of ‘Data Protection by Design and by Default’, PETs should:
1. Assist compliance with the principle of data minimisation by ensuring that organisations or individuals only process the data they need for their purpose.
2. Provide an appropriate level of security.
3. Implement robust anonymisation or pseudonymisation solutions (illustrated in the sketch after this list).
4. Minimise the risk that arises from personal data breaches.
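To make this more concrete, the short Python sketch below illustrates one way a pseudonymisation technique of the kind described above might look in practice: a direct identifier is replaced with a keyed digest, and only the minimum fields needed for the stated purpose are retained. The field names, secret key, and record shown are hypothetical illustrations and are not taken from the ICO guidance.

```python
# Illustrative sketch only: pseudonymisation combined with data minimisation.
# The key, field names and record below are hypothetical, not from the ICO guidance.
import hmac
import hashlib

# Hypothetical secret key, which would be stored securely and separately from the data.
PSEUDONYMISATION_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed digest that cannot be reversed without the key."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical source record containing a direct identifier.
record = {"email": "customer@example.com", "purchase_total": 42.50, "date_of_birth": "1990-01-01"}

# Only the pseudonym and the field actually needed for the analysis are kept,
# in line with the data minimisation principle.
minimised_record = {
    "customer_pseudonym": pseudonymise(record["email"]),
    "purchase_total": record["purchase_total"],
}
print(minimised_record)
```

Because the digest is keyed, the organisation holding the key can still link records relating to the same individual where necessary, while the shared dataset no longer contains the direct identifier; whether such a measure amounts to pseudonymisation or effective anonymisation in a particular context is precisely the kind of assessment the draft guidance encourages organisations to make.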
John Edwards, the UK Information Commissioner, said:
“It’s not just regulators that need to take action – we need the industry to step up, too. We want organisations to come to us with codes of conduct and certification schemes, for example, to show their commitment to building services or products that are designed in a privacy-friendly way and protect people’s data”
Before using PETs, the ICO encourages organisations and individuals to carefully assess the impact on decision-making processes and purpose specification, and to consider how they can comply with accuracy and accountability requirements.
To improve the final guidance and its practical usefulness, the ICO is seeking views through a consultation on its updated draft guidance on anonymisation, pseudonymisation, and PETs, which runs until December 31st, 2022. For the ICO’s formal announcement on PETs, click here.
A niche firm with a specialist scope, we provide unparalleled expertise in data privacy law at affordable prices. Privacy Partnership Law is a new offering from the Privacy Partnership Group, which also includes Privacy Partnership, the UK's leading specialist data privacy consultancy. Privacy Partnership has been a trusted advisor to UK-based and international businesses, charities and government on data protection compliance since 2000. Privacy Partnership Law offers clients an increased range of data privacy and data protection advice and support from the established and respected data protection lawyers in our expert team and from around the world.
Privacy Partnership Law provides a full range of data privacy-related legal services, both in the UK and internationally, through our network of specialist partner firms. We can support you in dealing with data protection complaints and data subject access requests, help you negotiate and review data privacy contract terms, conduct data privacy assessments and transfer impact assessments, or conduct due diligence. We also offer a full range of consultancy services, outsourced DPO services and Smart Privacy technology through our sister consultancy company, Privacy Partnership.
Privacy Partnership Law comprises some of the UK’s leading data privacy lawyers and consultants, who enjoy a reputation for ground-breaking privacy solutions that are both pragmatic and cost-effective.
Our lawyers and consultants have an international reputation for excellence, and our specialist teams mean we focus on giving you expert advice at a competitive cost.
Privacy Partnership Law
Privacy Partnership Law Ltd is regulated by the Solicitors Regulation Authority with registration number 829686.
Privacy Partnership Law Ltd is a registered company based in England and Wales with registration number 13211514 and a registered office at 3 Linhope Street, London, England, NW1 6ES. VAT number 401788010. It forms part of the Privacy Partnership Group of Companies.
Copyright © 2022 Privacy Partnership Law - All Rights Reserved. No part of this website may be copied or reproduced without permission.