Platform liability for misuse of private information – scope of the EU’s hosting defence

Just before Christmas the Northern Ireland Court of Appeal (“NICA”) gave its judgment in CG v Facebook Ireland Limited ("Facebook") and Joseph McCloskey [1]. In a widely reported first instance decision, CG, a convicted sex offender, had been awarded £20,000 in damages for misuse of private information and harassment. Whilst some elements of the first instance decision were overturned, the NICA held that Facebook was liable in damages for misuse of certain private information hosted on its website, but not liable in damages under the Data Protection Act. This means the £20,000 figure is now likely to be reduced.


The case concerned a Facebook page published by Mr McCloskey called “Keeping Our Kids Safe from Predators 2”. At various times this page featured CG’s image and details of the area in which he lived. Cumulatively, certain information relating to CG’s address or whereabouts was held to be private. In addition, some comments on the pages were of an abusive and threatening nature. All of the information was hosted by Facebook.

The two key issues on appeal were:

  1. the extent to which Facebook had actual knowledge of misuse of private information on its website, or knowledge of the facts and circumstances which made it apparent that the information published on its website was private; and
  2. whether Facebook was established in the UK under the Data Protection Act by virtue of its relationship with Facebook UK Limited.

E-Commerce Directive – knowledge is key

Article 14 of the E-Commerce Directive (the hosting defence) provides that an internet service provider storing unlawful material is not liable to pay damages provided that: (a) it does not have “actual knowledge” of the unlawful activity and is not aware of the facts or circumstances from which the unlawful activity would have been apparent; and (b) upon obtaining such knowledge or awareness it acts expeditiously to remove the material. This provision gives rise to the ‘notice and take-down’ procedures operated by almost all online platforms.

In the CG case, the Court analysed the various points in time at which CG alleged Facebook had the requisite knowledge. The Court took a broad brush approach to this issue and did not require CG to have used the take-down procedure provided by Facebook, as long as, overall, Facebook had knowledge that the published material was private and of the location of the page on which it was published (i.e. the URL). According to the Court, it did not even matter that CG’s solicitors had not in fact complained specifically of misuse of private information (the cause of action on which CG ultimately succeeded) but had relied instead on defamation and interference with the right to life arising from users’ comments (claims which were described in the judgment as “entirely misconceived”). Nor did it matter that the URL and the identification of the unlawful information were not contained in the same correspondence. The Court did, however, make clear that “constructive knowledge” (i.e. the argument that Facebook “should have known”, which would effectively require Facebook to conduct a monitoring exercise) was not the correct test.

In light of the course of correspondence as a whole, the Court held that at one particular point in time (out of several alleged) Facebook had knowledge of the information about the location of CG’s residence and the location of the page on which it was posted. Therefore at this point Facebook did have the requisite knowledge of the facts and circumstances which made it apparent that the published material was private, according to the ruling. Facebook was therefore liable in damages for hosting the material for a short period before it was ultimately taken down.

Unlike the Court at first instance, the NICA agreed with Facebook that certain information said to be private, such as details of CG’s conviction, was not in fact so: it all depended on the context.

Data Protection

The Court held that Facebook was a data controller established in the UK for the purposes of the Data Protection Act by virtue of its relationship with Facebook UK Limited, which engaged in the “effective and real exercise of activity through stable arrangements in the UK”.

The Court relied heavily on the decision of the CJEU in Google Spain v AEPD and González (C-131/12) [2014], which concerned the removal of personal data appearing in search results. In that case Google Inc (the parent company of Google Spain) was held to be established in Spain by virtue of the activities of its Spanish subsidiary.

This aspect of the judgment reflects an increasing trend of national courts in Europe finding jurisdiction over platforms based in the US (and elsewhere) based on the activities of local subsidiaries.

Nonetheless, despite its finding on jurisdiction, the NICA accepted Facebook’s argument that it could not be liable in damages for the data protection claim because it was entitled to rely on Article 14 of the E-Commerce Directive (hosting defence).

Comment

Consistent with the current EU legislative regime, the NICA stopped short of holding that Facebook should have monitored for unlawful postings by McCloskey (which would be contrary to Article 15 of the E-Commerce Directive, which prohibits a general monitoring obligation). However, it was still held that Facebook had the requisite knowledge as a result of the combined information available to it, rather than requiring it all to be set out in one place. Facebook was not permitted to rely simply on its ‘notice and take-down’ procedure, according to the judgment. This aspect of the decision is in keeping with other judgments from the Courts of Northern Ireland, which is increasingly being seen as a rather platform-unfriendly jurisdiction.

This approach reflects the prevailing direction in which the regulation of platforms and other online service providers in the EU appears to be heading. For example, the proposed Copyright in the Digital Single Market Directive contains provisions requiring information society service providers (which include platforms) that store or provide access to large amounts of user-uploaded content to take “appropriate and proportionate” measures to identify infringing content. Simultaneously, the proposed amendments to the Audiovisual Media Services Directive contain provisions requiring video-sharing platform providers to protect minors from harmful content and everyone from incitement to violence or hatred.

Although the draft measures will be heavily contested and remain some way from becoming law, there is certainly a risk to online platforms that they will be required to adapt to an erosion of the hosting defence in the EU in the coming years and, potentially, to be more proactive in removing unlawful content. This means online platforms may need to become ever more accustomed to dealing with the subtleties of various national laws within EU Member States, especially where the approach to freedom of speech is so different between the US and Europe.

[1] CG v Facebook Ireland Limited and Joseph McCloskey [2016] NICA 54.
