On 4 June 2018, Information Commissioner Elizabeth Denham told MEPs that she was ‘deeply concerned’ about the misuse of social media users’ data.
Information Commissioner Denham was speaking at the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) inquiry into the use of 87 million Facebook profiles by Cambridge Analytica and its consequences for data protection and the wider democratic process. The whole affair has shone a light on how Facebook collected, shared, and used data to target people with political and commercial advertising. And, in a warning to social media giants, she announced:
“Online platforms can no longer say that they are merely a platform for content; they must take responsibility for the provenance of the information that is provided to users.”
Although this is tough talk from the UK’s guardian of information rights – and many others, including politicians, have used similar language – the initial response from the Information Commissioner was hardly swift.
The Information Commissioner’s Office (ICO) stumbled at the first hurdle, failing to secure a search warrant for Cambridge Analytica’s premises. Four days after Information Commissioner Denham announced her intention to raid the premises, she was finally granted a warrant after a five-hour hearing at the Royal Courts of Justice. This delay – and concerns over the resources available to the ICO – led commentators to question whether the regulator has sufficient powers to tackle tech giants such as Facebook.
Unsurprisingly, it was not long before Information Commissioner Denham went into “intense discussion” with the Government to increase the powers at her disposal. At a conference in London, she explained:
“Of course, we need to respect the rights of companies, but we also need streamlined warrant processes with a lower threshold than we currently have in the law.”
Conservative MP Damian Collins, Chair of the Digital, Culture, Media and Sport select committee, expressed similar sentiments, calling on Twitter for new enforcement powers to be included in the Data Protection Bill.
Eventually, after a year of debate, the Data Protection Act 2018 was passed on 23 May. On the ICO blog, Information Commissioner Elizabeth Denham welcomed the new law, highlighting that:
“The legislation requires increased transparency and accountability from organisations, and stronger rules to protect against theft and loss of data with serious sanctions and fines for those that deliberately or negligently misuse data.”
By introducing this Act, the UK Government is attempting to address a number of issues. However, the Information Commissioner will be particularly pleased to have received greater enforcement powers, including the creation of two new criminal offences: the ‘alteration etc of personal data to prevent disclosure’ and the ‘re-identification of de-identified personal data’.
On 25 May, the long-awaited General Data Protection Regulation (GDPR) came into force. The Data Protection Act incorporates many of the provisions of the GDPR, such as the ability to levy heavy fines on organisations (up to €20,000,000 or 4% of global turnover). The Act also derogates from EU law in areas such as national security and the processing of immigration-related data. The ICO recommends that the GDPR and the Data Protection Act 2018 are read side by side.
However, not everyone is happy with GDPR and the new Data Protection Act. Tomaso Falchetta, head of advocacy and policy at Privacy International, has highlighted that although they welcome the additional powers given to the Information Commissioner, there are concerns over the:
“wide exemptions that undermine the rights of individuals, particularly with a wide exemption for immigration purposes and on the ever-vague and all-encompassing national security grounds”.
In addition, Dominic Hallas, executive director of The Coalition for a Digital Economy (Coadec), has warned that we must avoid a hasty regulatory response to the Facebook-Cambridge Analytica scandal. He argues that although it’s tempting to hold social media companies liable for the content of users, there are risks in taking this action. He explains:
“Pushing legal responsibility onto firms might look politically appealing, but the law will apply across the board. Facebook and other tech giants have the resources to accept the financial risks of outsized liability – start-ups don’t. The end result would entrench the positions of those same companies that politicians are aiming for and instead crush competitors with fewer resources.”
The Facebook-Cambridge Analytica scandal has brought privacy to the forefront of the public’s attention. And although the social media platform has experienced declining user engagement and the withdrawal of high-profile individuals (such as entrepreneur Elon Musk), its global presence and the convenience it offers users suggest it’s going to be around for some time yet.
Therefore, the ICO and other regulators must work with politicians, tech companies, and citizens to have an honest debate on the limits of privacy in a world of social media. The GDPR and the Data Protection Act provide a good start in laying down the ground rules. However, in the ever-changing world of technology, it will be important that this discussion continues, so that solutions can be found to future challenges. Only then will we avoid walking into another global privacy scandal.