Data Protection – March 2019

01/04/2019

Latest from the ICO, including a focus on artificial intelligence; recent enforcement action; news from Europe; and more.

Latest from the Information Commissioner’s Office (ICO) – focus on artificial intelligence

On 18 March 2019, the ICO issued a call on its new AI auditing framework blog site for organisations to participate in the development of an auditing framework for artificial intelligence (AI). As the ICO’s Executive Director for Technology Policy and Innovation explains: “Applications of AI are starting to permeate many aspects of our lives…We know the benefits that AI can bring to organisations and individuals. But there are risks too… The law requires organisations to build-in data protection by design and to identify and address risks at the outset by completing data protection impact assessments. Privacy and innovation must sit side-by-side. One cannot be at the expense of the other. That’s why AI is one of our top three strategic priorities”. Feedback will inform a formal consultation paper which the ICO expects to publish by January 2020. The final framework and associated guidance for firms are then intended to be published by spring 2020.

A follow-up post published on 26 March 2019 outlines the proposed structure, core components and areas of focus of the new framework. The two key components are:

  • governance and accountability, which will discuss the measures an organisation must have in place to be compliant with data protection requirements; and
  • eight identified AI-specific risk areas, focusing on the data protection risks that may arise in each area and the risk management practices appropriate to address them.

The idea is that, over the next six months or so, the blog site will be updated every two to three weeks with posts looking in more detail at each of these risk areas and exploring the associated risks. The eight areas are: fairness and transparency in profiling; accuracy; fully automated decision-making models; security and cyber; trade-offs (i.e. the challenges of balancing different constraints when optimising AI models); data minimisation and purpose limitation; exercise of rights; and the impact on broader public interests and rights.

In other news:

  • An annual investigation by the Global Privacy Enforcement Network, involving data protection authorities from around the world, shows that organisations should be doing more to achieve privacy accountability. See the ICO’s press release.
  • The Joint ICO-Ofcom action plan to address the consumer harm caused by nuisance calls and messages was recently updated.
  • The ICO published a blog post setting out practical advice and tips for medical practices on dealing with the increasing number of data subject access requests which, under the new data protection regime, must be processed free of charge and within one month.
  • A fact-finding forum on adtech took place on 6 March 2019, prompted by concerns about how people’s personal data is used in real-time bidding in programmatic advertising. Discussions centred on the three themes of transparency, lawful basis and security. See this summary report and blog post, which set out what happened at the event and the consensus on a need for change.

Recent ICO enforcement action

  • A pensions company, which relied on ‘misleading’ professional advice on the use of hosted marketing, was fined £40,000 for sending nearly two million spam emails.
  • Vote Leave was also fined £40,000 after it sent thousands of unsolicited text messages in the run up to the EU referendum. Among other things, the ICO Director of Investigations said: “Political campaigns and parties, like any other organisations, have to comply with the law”.
  • Two workers were fined in separate cases for breaching data protection laws. The first had unlawfully accessed personal records while employed at an NHS Foundation Trust, while the second had forwarded work emails containing the personal data of customers and other employees to her personal email account, weeks before she resigned.
  • Two addresses were searched by the ICO as part of an investigation into businesses suspected of making live and automated nuisance calls.

Regulating in a digital world

On 9 March 2019, the House of Lords Select Committee on Communications published a report titled ‘Regulating in a digital world’, which sets out ten principles that “should guide the development and implementation of regulation online and be used to set expectations of digital services”.

On the subject of ethical technology, the report concludes that, while the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 provide valuable safeguards, including subject access rights to ensure that data is accurate and up to date, and the right to opt out of purely automated processing, there are weaknesses in the regime. For example, a subject access request does not give individuals automatic access to behavioural data generated about them, because such data is deemed to be the property of the company that acquired it. The report sets out a series of recommendations.

News from Europe

The European Data Protection Board (EDPB) met for its eighth plenary session on 12 and 13 March 2019. Among other things, the EDPB adopted a statement on the ePrivacy Regulation, calling on EU legislators to intensify efforts towards the adoption of the Regulation. It stressed that the Regulation must under no circumstances lower the level of protection offered by the current ePrivacy Directive, and must complement the GDPR by providing additional strong guarantees for all types of electronic communications.

The EDPB also adopted a statement and accompanying annex on the use of personal data in the course of political campaigns.

On 12 March 2019, the European Parliament adopted the EU Cybersecurity Act, which establishes the first EU-wide cybersecurity certification scheme to ensure that certified products, processes and services sold in the EU meet cybersecurity standards. The Parliament also adopted a resolution calling for action at EU level on the security threats linked to China’s growing technological presence in the EU. According to the press release, MEPs are deeply concerned about recent allegations that 5G equipment may have embedded backdoors that would allow Chinese manufacturers and authorities to have unauthorised access to private and personal data and telecommunications in the EU.

Contacts