Data Protection – April/May 2019

Latest from the ICO; recent enforcement action; cybersecurity update; Morrisons data breach appeal; and more.

Latest from the Information Commissioner’s Office (ICO)

The ICO has updated its guidance on certification schemes under the General Data Protection Regulation (GDPR). Certification is a way for an organisation to demonstrate its compliance with GDPR. Pending finalisation of European Data Protection Board (EDPB) guidelines, the ICO says that it welcomes enquiries from organisations who are considering developing a GDPR certification scheme.

Similarly, pending finalisation of EDPB guidelines, the ICO is also welcoming enquiries from representative organisations who are considering developing codes of conduct. Under GDPR, trade associations and representative bodies may draw up codes of conduct covering topics that are important to their members, such as fair and transparent processing, pseudonymisation or the exercise of people’s rights. See the codes of conduct section of the ICO’s Guide to the GDPR for more details, including timelines and next steps. In a submission to the EDPB, Insurance Europe has warned that the EDPB draft guidelines on codes of conduct and monitoring bodies go beyond the text of GDPR: the draft guidelines state that approval of a code of conduct will depend on the mandatory appointment of a body to police compliance with the code, when in fact the GDPR makes such an appointment optional.

The ICO also recently updated the section of its Guide to freedom of information on refusing a request, changing the guidance on: when you can use an exemption to neither confirm nor deny that you hold information, if doing so would disclose personal data; and when you can refuse a request because it contains personal data.

On 8 May 2019, the ICO launched a ‘Be Data Aware’ campaign to help people understand how organisations might be using their data to target them online, and how people can control who is targeting them. See this blog post for details.

In an earlier blog post, the ICO reminded public and private organisations that new data protection legislation does not stop them from disclosing personal data to assist police forces or other law enforcement authorities.

Three further posts have been added to the ICO’s new AI auditing framework blog: the first covers automated decision-making and the role of meaningful human reviews; the second explores how the data protection principle of accuracy applies to AI systems, and proposes some steps organisations should take to ensure compliance; and the third looks at known security risks exacerbated by AI.

On 15 April 2019, the ICO launched a consultation on a code of practice for online services which sets out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children and which process their data. The consultation closes on 31 May 2019. In related news, the ICO responded in a statement following publication of the government’s Online Harms White Paper, which sets out the government’s plans for “a world-leading package of online safety measures that also supports innovation and a thriving digital economy”. The consultation on the White Paper closes on 1 July 2019.

Recent ICO enforcement action

The ICO has reminded organisations of the consequences of failing to pay the required data protection fee, after Farrow and Ball’s appeal against a £4,000 fine for non-payment was dismissed by the First-tier Tribunal [1]. The person responsible within the company was on holiday when a reminder was sent (note that the ICO does not have to send reminders) and the correspondence was not recognised as important internally. The tribunal concluded that a reasonable data controller would have systems in place to comply with the relevant regulations, and that the company had pointed to no particular difficulty or misfortune explaining its departure from the standards expected of a reasonable data controller.

In other enforcement news:

  • A pregnancy and parenting club, which collected personal information for the purpose of membership registration but also operated as a data broking service, was fined £400,000 under the old Data Protection Act after it shared personal information with a number of organisations without being fully clear with people that it might do so.
  • A PPI claims company was fined £120,000 for sending more than 3.5 million unlawful direct marketing text messages about its services. The company claimed that consent had been obtained when people subscribed to one of four websites. However, it was named on only two of those websites’ privacy policies and people were required to give consent to receive marketing from third parties as a condition of subscribing, which is against the law.
  • A television production company was also fined £120,000 for unlawfully filming patients at a maternity clinic. Although the company had the hospital trust’s permission to be on site, patients were not provided with adequate information and the company did not obtain adequate permission from those affected by the filming in advance.
  • A company selling funeral plans was fined £80,000 after almost 52,000 calls were made to people who were registered with the Telephone Preference Service.
  • The London Borough of Newham was fined £145,000 for disclosing sensitive personal data about alleged gang members. An ICO investigation found that the Council did not report the breach, nor did it have any specific sharing agreements, policy or guidance in place to determine how its own staff and partner organisations should handle and use the ‘Gangs Matrix’ databases securely.
  • A former NHS manager was fined for sending personal data of job applicants to her own email account without authorisation, a few days after she was suspended from the surgery where she was working.
  • In relation to the use of voice authentication for customer verification on some of HMRC’s helplines, an ICO investigation found that HMRC had failed to give customers sufficient information about how their biometric data would be processed and failed to give them the chance to give or withhold consent. This was a breach of GDPR and HMRC was ordered to delete any data it continued to hold without consent. In a recent blog post, the ICO’s Deputy Commissioner for Policy highlights the key points that organisations need to consider if they are planning on using new and innovative technologies that involve personal data, including biometric data.

Cybersecurity update

On 3 April 2019, the government published the results of its Cyber Security Breaches Survey 2019. A summary of the findings can be found on pages 2 to 5, with the conclusions on pages 60 and 61. Among other things, the report says that there is still more organisations can do to protect themselves from cyber risks, including taking important actions that remain relatively uncommon: board-level involvement in cybersecurity, monitoring suppliers and planning incident response. The survey shows, in particular, how the GDPR has accelerated the pace of change across organisations, but GDPR may only take organisations so far; beyond this, the findings show there is still room for a more holistic approach to cybersecurity. The findings also continue to highlight the importance of board-level engagement with cybersecurity: instilling better knowledge and understanding of cybersecurity across board members can be the difference between cybersecurity being treated as a fairly high priority or a very high priority.

The government is consulting until 5 June 2019 on proposals for new mandatory industry requirements to ensure consumer smart devices adhere to a basic level of security. See the press release for details.

The government is also consulting, until 11 June 2019, on the UK’s proposed approach to regulating non-UK based digital service providers operating in the UK under the Network and Information Systems Regulations 2018 post-Brexit.

Other news

Supermarket chain Morrisons has been granted permission to appeal to the Supreme Court against last year’s Court of Appeal decision, which upheld a High Court ruling that the company was vicariously liable in damages for the actions of a former employee. While employed as a senior internal auditor at the company, the employee deliberately leaked payroll data relating to almost 100,000 employees online following disciplinary action. Walker Morris will continue to monitor and report on developments.

In Rudd v Bridle [2], the High Court provided guidance on the correct approach to handling data subject access requests. Watch out for our separate upcoming briefing on this topic.

Facebook has updated its terms to explain clearly how the company uses its users’ data to profile them and target advertising, activities which finance the company. The move follows discussions with the European Commission and consumer authorities. See the Commission’s press release for details.

On the subject of Facebook, the Information Commissioner commented on an opinion piece in The Washington Post, in which Facebook’s chief executive Mark Zuckerberg called for more regulation of the internet. She said that she expected Facebook to review its current appeal against the £500,000 ICO fine issued to the company for contravening UK privacy laws.

And finally, the EDPB has recently been consulting on draft guidelines on the ‘contract’ lawful basis for processing personal data under GDPR in the context of online services. Again, Walker Morris will continue to monitor and report on developments.


[1] Farrow and Ball Limited v The Information Commissioner, EA/2018/0269
[2] [2019] EWHC 893 (QB)