
AG Opinion on C-18/18: Towards private regulation of speech worldwide

Published on Jun 28, 2019

The case of Glawischnig-Piesczek v Facebook offers the Court of Justice the opportunity to clarify the personal and material scope of monitoring obligations that may be imposed on Internet intermediaries, i.e. those private entities that ‘give access to, host, transmit and index content originated by third parties’. The decision of the Court will determine whether, and of what nature, monitoring obligations can be imposed on digital platforms by domestic courts, and how much power courts should be given to impose their own standards of acceptable speech across national boundaries. The opinion of the Advocate General, rendered earlier this month, raises some concerns for on-line freedom of expression because of its expansive approach to both monitoring obligations and jurisdictional limitations.

Facts of the case

The case originates from a request by the former Austrian MP and spokeswoman of the Green Party to remove a post shared by a user on their personal Facebook page, in which the user used derogatory language to criticise a news article covering the refugee crisis in Austria and comments on the topic made by the plaintiff in an interview. After the platform declined a request to remove the user’s post, the plaintiff obtained from the Commercial Court of Vienna an interim injunction ordering Facebook to remove both the original post and any other post on the platform with ‘analogous’ content. This second part of the injunction was later set aside by the Higher Regional Court of Vienna, which considered it tantamount to a general monitoring obligation and thus incompatible with the E-commerce Directive. The Higher Court nevertheless confirmed that Facebook should remove any future posts including the same derogatory text alongside any image of the plaintiff. Facebook appealed this decision to the Austrian Supreme Court.

The Supreme Court referred two main sets of questions to the CJEU:

- First, whether ordering a host provider to remove posts that are ‘identically worded’ to other illegal content is compatible with Article 15(1) of the E-Commerce Directive. If the answer is affirmative, the Court asks whether this obligation could extend beyond identical content to include content that is analogous in substance despite a different wording. These are ultimately questions about the responsibility that platforms can be given in making their own assessment of what content amounts to unlawful speech, and about the limits of “active monitoring”.

- Second, whether national courts can order platforms to remove content only within national boundaries, or beyond (‘worldwide’). This is a question concerning the admissibility of extra-territorial injunctions for content removal.

The AG’s Opinion

The first point discussed by the AG concerns the nature of general active monitoring obligations, whose compatibility with EU law is explicitly excluded by the E-commerce Directive. What exactly amounts to an obligation of this kind has often been a point of contention in recent years; the AG’s effort to delineate the contours of the notion helps dispel some long-standing confusion. The Opinion draws a distinction between general and specific monitoring obligations, and finds the former fundamentally incompatible with the E-commerce Directive: complying with a general monitoring duty would necessarily cause a platform to lose its neutrality, and would logically contradict the very principle that immunities are grounded in the passive nature of an intermediary.

With regard to the question on identical content, the AG indicates that for a monitoring obligation to be specific, it needs to be narrowly construed with regard to four factors: (i) the nature of the infringements, (ii) their author and (iii) their subject must be the same as those of the original content, and (iv) the duration of the monitoring obligation must be limited. In the context of the case at stake, it follows that a court may order Facebook to remove all re-postings of the original content made through the “share” function. The temporal limitation resides in the very nature of the interim relief ordered by the Austrian court; no further guidance is offered as to what could be a suitable timeframe for monitoring obligations ordered in the context of a final judgment.

With regard to equivalent content, the AG accepts that intellectual property and defamation work in different ways and that the precedent established in L’Oréal v eBay is at least partially ill-suited to this case. The concepts of identity or equivalence of content do not have the same meaning in the two contexts: in the case of copyright infringements, the sameness between two items is precisely what makes the second iteration of the content illegal, whilst defamation very rarely consists of precisely the same terms as another defamatory act. As a result, the AG recommends that the question on removal obligations for equivalent content be answered in the affirmative, but limited to content coming from the same user as the original post.

What future for free speech on-line? The issue of technology-based solutions

It is evident that, in the reasoning of the Advocate General, the assumption that identifying identical content is an easy task that can be performed by automated means plays a fundamental role in finding such an obligation compatible with the Directive. Absent such ease of automation, the task could only be performed by employing ‘sophisticated solutions’, and an obligation to that effect would be incompatible with the Directive.

A determining factor is thus whether monitoring obligations can be fulfilled by automated means or require human intervention, as in the latter case the platform would lose its merely passive function. However, the conclusion that the removal of identical content would be performed by merely automated means seems based on an anecdotal and unsupported assumption that ‘software tools’ (not further specified) would entirely replace human curation and be capable of detecting all the information disseminated via the platform. In fact, there is no evidence in the record before the Austrian courts or the CJEU that such a technology exists today – no small detail, since the availability of technology at such an advanced stage is critical to the finding that searching for content would not create a general monitoring obligation prohibited by law.
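To illustrate the technical distinction on which this part of the Opinion turns, the following is a minimal sketch (in Python; the normalisation rules and the blocked post are hypothetical, not drawn from the case) of the kind of purely mechanical duplicate check that plausibly can be automated today. It flags only literally identical wording and makes no assessment of context, tone or purpose.

```python
import hashlib

def normalise(text: str) -> str:
    # Collapse case and whitespace so trivial formatting differences
    # do not defeat an exact-duplicate comparison.
    return " ".join(text.lower().split())

def fingerprint(text: str) -> str:
    # A hash of the normalised text: two posts share a fingerprint
    # only if their wording is identical after normalisation.
    return hashlib.sha256(normalise(text).encode("utf-8")).hexdigest()

# Hypothetical: fingerprints of posts already found unlawful by a court.
blocked_fingerprints = {fingerprint("original derogatory post text")}

def is_identical_repost(new_post: str) -> bool:
    # Purely mechanical matching, with no judgment about meaning.
    return fingerprint(new_post) in blocked_fingerprints
```

Even a single added word, a paraphrase, or a quotation of the unlawful text for the purpose of criticising it produces a different fingerprint and escapes this check; anything beyond literal comparison requires a similarity judgment, which is where the Opinion’s reasoning becomes strained.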

The apparent lack of understanding of the technicalities of content removal is probably due to the repeated application of precedents drawn from copyright protection, despite the admission that fundamental differences exist compared to content regulation. One such difference is that context plays a different role in the two fields: the words in a text-based post may or may not be unlawful depending on a number of context-specific factors, such as its tone and purpose (consider satire, parody, or journalistic reporting, as in the ECtHR decision in Jersild, to mention only a few of the most common and likely; on this very point, it was reported last week that YouTube is mistakenly removing historical documents of educational value in its effort to tackle hate speech). It remains unclear, following the Opinion, how a platform would be expected to act in the case of language that replicates a post already found illegal but in a different context (for instance, to ridicule it), or that merely adds a few words to it. It seems evident that, at the current state of the art of filtering technology, human intervention would be required to discern when a post needs to be removed, though this is exactly what, in the AG’s reasoning, would turn an admissible monitoring obligation into one incompatible with the E-commerce Directive.

Even assuming that such a technology existed, it is not clear how widely available and affordable it would be, which should raise concerns given its fundamental role in complying with a monitoring obligation. If, for instance, content-detecting software were expensive, smaller companies or start-ups might lack the resources to afford and deploy it, as the Commission rightly flagged to the Court in its submissions at the hearing. Recent case-law of the European Court of Human Rights (cf. Delfi v Estonia; MTE v Hungary; Pihl v Sweden; Tamiz v UK) has accepted that the business model and size of IT companies can justify graduated obligations to remove content; the AG, however, does not engage with this aspect and offers no guidance as to whether different user bases and financial capacities would weigh against a one-size-fits-all monitoring obligation, nor does there seem to be any engagement with how this would impact, within EU law, the freedom to conduct a business. Overall, the Opinion is understandably conceived through the lens of what a global giant like Facebook is (expectedly) capable of achieving, though imposing a similar obligation on any company of smaller size would result in a burden impossible to bear.

Similar practical concerns regard the implementation of interim injunctions to block or remove equivalent content. The Opinion delivers little guidance as to how similar a new post must be to the original content to fall under the category of ‘equivalent’ content: the AG suggests that, on his own understanding, this would amount to ‘information that scarcely diverges from the original information or to situations in which the message remains essentially unaltered’. As a definition, this still leaves room for widely diverging interpretations by domestic courts and, worryingly, for abuse, despite examples (such as a ‘typographical error’ or ‘slightly altered syntax or punctuation’) that point towards merely formal, minimal changes. Most relevantly, the Opinion seems to imply that platforms would eventually need to make autonomous and active determinations as to what degree of similarity effectively amounts to equivalent content – whatever standard for the concept of equivalence is eventually followed. This, however, as the AG himself highlighted, would make the intermediary service provider no longer merely technical, automatic and passive, which in turn would prove incompatible with the E-commerce Directive. This is an apparent contradiction in legal standards that the CJEU will need to resolve, hopefully after considering in more depth its potential for over-blocking and its negative ramifications for free expression.
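The contradiction can be made concrete: any automated test for ‘equivalence’ must reduce the notion to a numeric similarity score and a cut-off, and the choice of that cut-off is itself an editorial judgment. A minimal sketch, assuming a standard edit-distance ratio from Python’s standard library (the threshold value is purely illustrative):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Ratio of matching characters between the normalised texts:
    # 1.0 means identical, 0.0 means nothing in common.
    a = " ".join(a.lower().split())
    b = " ".join(b.lower().split())
    return SequenceMatcher(None, a, b).ratio()

# Illustrative only: is a score of 0.9, or of 0.7, still content that
# 'scarcely diverges from the original information'?
EQUIVALENCE_THRESHOLD = 0.9

def is_equivalent(original: str, new_post: str) -> bool:
    return similarity(original, new_post) >= EQUIVALENCE_THRESHOLD
```

A post quoting the unlawful text in order to condemn it would score close to 1.0 despite carrying the opposite meaning, while a fresh defamatory paraphrase might score well below any plausible threshold: both the choice of threshold and the treatment of such cases are precisely the autonomous determinations that, on the AG’s own reasoning, a merely passive host cannot make.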

Towards global governance of speech?

Finally, on the question of territorial scope, the AG starts from the premise that the applicant’s action is not based on EU law: the specific matter of defamation is not regulated by the E-commerce Directive, nor is it possible to justify the territorial effects of an injunction by reference to the protection of personality rights and private life. The scope of application of the Charter of Fundamental Rights coincides with that of EU law, rather than the opposite, and cannot justify its expansion, and Regulation No 1215/2012 does not govern the effects of injunctions outside the EU. As a result, EU law does not preclude extraterritorial effects of an injunction to remove content. In principle, the Court could follow the example set in Bolagsupplysningen and apply the single law designated as applicable to the entirety of the application. The AG notes, however, that should this approach prevail, its practical implementation would prove difficult, especially in relation to balancing conflicting interests. Indeed, the Opinion acknowledges that the legitimate interest in accessing the removed information would likely vary across countries, resulting in the right to receive information being curtailed in a number of third countries. The preferred outcome would thus be for domestic courts, in the interest of international comity, to exercise self-restraint and limit the extraterritorial effects of their injunctions.

The Opinion includes a number of encouraging reflections: in particular, as was already suggested in a recent post, it affirms the need to keep monitoring obligations as narrowly construed as possible; it reflects on the need to prevent platforms from making autonomous determinations on the illegality of identical or analogous content; it at least hints that the notions of identical and analogous content cannot be borrowed from the field of intellectual property without taking the specificities of content regulation into account; and it recognises that an expansive approach to territorial jurisdiction would endanger the freedom to receive information in third countries.

There are, however, some evident limitations in the way these principles are applied to the case: the expectation that technology can achieve something that is not quite possible at present (and still seems unlikely in the foreseeable future) frustrates the AG’s own finding that platforms should only be required to play a neutral role in content monitoring. It is also unfortunate that the AG failed to appreciate in depth that the speech at stake is not just a case of “ordinary” defamation but rather political speech, which as such must be granted a wider degree of tolerance. The two issues, read in conjunction, paint a rather worrying picture for the future of freedom of expression on-line. Private platforms with the power to make autonomous determinations on identical and equivalent content and to remove it worldwide is a scenario hardly compatible with the European Court of Human Rights’ case-law on political speech and its graduated approach to platforms’ responsibilities, and it will be of the utmost importance for the CJEU to correct this direction.
