A look at the AG Opinion on Article 17

Last week, Advocate General Saugmandsgaard Øe at the CJEU issued his opinion in Case C-401/19, the Polish request to annul Article 17 of the CDSM directive. According to his Opinion, the preventive measures to monitor and block users’ uploads envisioned by Article 17(4) constitute a limitation on the exercise of the right to freedom of expression and information of the users of sharing services, but such a limitation is compatible with Article 11 of the Charter of Fundamental Rights of the European Union, since all the conditions laid down in Article 52(1) of the Charter are satisfied. 

In particular, the Advocate General found that the new liability regime established by Article 17(4) respects the proportionality requirement – despite entailing significant risks for freedom of expression – as it is accompanied by sufficient safeguards to minimise those risks:

  • sharing service providers are not authorised to preventively block all content which reproduces the copyright-protected content identified by rightholders, including lawful content (Article 17(7));
  • those providers are obliged to detect and block only content the unlawfulness of which seems manifest in the light of the ‘relevant and necessary’ information provided by the rightholders (Article 17(8));
  • additionally, and as a final safeguard for situations where, despite the obligation in Article 17(7), those providers nevertheless mistakenly block legitimate content, users have at their disposal a complaint and redress mechanism as well as out-of-court mechanisms (Article 17(9)).

While one could argue that the annulment of this problematic provision would be preferable, in light of the recent decision of the CJEU in Joined Cases C‑682/18, YouTube, and C‑683/18, Cyando, these clarifications on user rights safeguards are very much welcome. The views shared by the Advocate General are, in general, aligned with the arguments brought forward by COMMUNIA and other users’ rights organizations, as well as with the position held by a large group of academics. If the CJEU decides to follow the AG Opinion, the ruling should force Member States that have implemented Article 17 without proper user rights safeguards to reverse course.

Users of sharing services have subjective rights, and sharing service providers must act diligently in relation to them

One of the first highlights in the Advocate General’s Opinion is the affirmation that, in the second paragraph of Article 17(7), the EU legislature has expressly recognised that users of sharing services have subjective rights under copyright law – enforceable against the providers of those services and against rightholders – to make legitimate uses of copyrighted materials, including the right to rely on copyright exceptions. Safeguarding the effectiveness of those exceptions and protecting the right of users to benefit from them implies that sharing service providers need to act diligently not only in relation to rightholders, but also in relation to users.

As a result, sharing service providers (1) cannot exclude the application of copyright exceptions in their terms and conditions, (2) cannot agree with rightholders that a mere allegation of copyright infringement by the latter is sufficient to justify the blocking or removal of the content concerned, and (3) are not legally authorised to block or remove content which makes lawful use of copyrighted materials on the ground that such content infringes copyright.

Systematically blocking ex ante legitimate content is against fundamental freedoms

Secondly, the Opinion clarifies one of the most controversial questions in the debate surrounding the implementation of user rights safeguards: whether those rights need to be protected ex ante, or whether it is enough to consider them only after an upload has been blocked.

Advocate General Øe starts by stating that, in a democratic society, rightholders may not demand ‘zero risk’ of copyright infringement, where this would result in blocking “a significant amount of lawful content”. This means that, while preventive filtering and blocking is possible, there must be “a ‘fair balance’ between the effectiveness of filtering and its collateral effect.” 

A system that would systematically impose on users the burden of making a successful complaint in order to post legitimate content online and that would systematically delay the posting of (time-sensitive) uploads until the complaint had been decided would (1) have a ‘chilling effect’ on freedom of information and creation, decreasing the activity of users and (2) risk rendering user uploads irrelevant and of no interest to the public. In the AG’s Opinion, these collateral effects are too great to be compatible with freedom of expression and information. 

Therefore, an interpretation, such as the one put forward by the French and Spanish governments, according to which content could be systematically blocked ex ante, provided that users could obtain its reinstatement ex post, is unacceptable. AG Øe is adamant: “a possible restoration of content is not capable of remedying the damage caused to those users’ freedom of expression.” 

The complaint and redress mechanism and the out-of-court mechanisms envisaged in Article 17(9) are important, but such mechanisms “are not sufficient on their own to ensure a ‘fair balance’ between copyright and users’ freedom of expression”, in the Advocate General’s view. They are merely an additional safeguard for situations where service providers mistakenly block legitimate content. Separately, and as a cumulative obligation, Article 17(7) requires service providers not to preventively and systematically block legitimate content.

Content whose lawfulness is ambiguous must be presumed lawful and must not be preventively blocked

Thirdly, according to the Advocate General’s reading, the scope of the filtering measures that sharing service providers are required to use must be delimited by the general monitoring prohibition in Article 17(8). Therefore, a sharing service provider may only be required to look for content that has previously been established as illegal by a court or content which is obviously or manifestly unlawful from the outset, “without, inter alia, the need for contextualisation”. Sharing service providers cannot be required to filter and block content whose lawfulness would require an ‘independent assessment’ on their part, particularly since these are complex legal issues and the service providers do not have the necessary independence “to be turned into judges of online legality”.

This means that sharing service providers are required to block only two categories of content (and may use automated content recognition technologies to do so):

  • Content which constitutes an identical reproduction, “without additional elements or added value”, of copyrighted content identified by rightholders, and
  • Content which reproduces copyrighted content identified by rightholders “in the same way, but with insignificant alterations, with the result that the public would not distinguish it from the original subject matter” (e.g. format change, speed change, flipped image).

Conversely, content which, while reproducing copyrighted content identified by rightholders, is significantly different from such content cannot be subject to a preventive blocking measure. This includes “all ambiguous situations (…) in which, in particular, the application of exceptions and limitations to copyright is reasonably conceivable”, such as when extracts of works are included in longer content or re-used in other contexts, or with ‘transformative’ works. In those situations, AG Øe argues, the content concerned must be presumed to be lawful and the upload must be permitted.
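
To make this distinction concrete, the classification the AG describes can be sketched in code. The sketch below is purely illustrative: the notion of a ‘match report’, the field names and the threshold are our assumptions, not anything prescribed by the Opinion or by Article 17.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Assessment(Enum):
    MANIFESTLY_INFRINGING = auto()  # identical or "equivalent" reproduction: may be blocked ex ante
    AMBIGUOUS = auto()              # an exception is reasonably conceivable: presumed lawful


@dataclass
class MatchReport:
    """Hypothetical output of a content recognition tool (illustrative only)."""
    is_identical: bool            # exact copy, "without additional elements or added value"
    is_equivalent: bool           # insignificant alterations (format change, speed change, flipped image)
    added_material_ratio: float   # share of the upload that is not the matched work (0.0 to 1.0)


def classify(report: MatchReport, added_material_threshold: float = 0.1) -> Assessment:
    """Illustrative reading of the two blockable categories described by the AG.

    Only identical or 'equivalent' reproductions of the identified work count as
    manifestly infringing; anything significantly different (e.g. an extract
    embedded in longer content, or a 'transformative' re-use) is ambiguous and,
    on the AG's approach, must be presumed lawful and not blocked preventively.
    """
    if (report.is_identical or report.is_equivalent) and \
            report.added_material_ratio <= added_material_threshold:
        return Assessment.MANIFESTLY_INFRINGING
    return Assessment.AMBIGUOUS
```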

It is up to the Member States, not to private parties, to find practical solutions to help filters distinguish between ambiguous and manifestly infringing content

Finally, the Advocate General gives his opinion on whether the legislature can delegate the task of monitoring online legality to certain intermediaries without “laying down clear and precise rules governing the scope and application of the filtering measures”. On this point too, the views shared by the Advocate General are aligned with what we have previously argued: these practical solutions need to be designed by the Member States and the Commission, and not privately by the service providers or the rightholders. In his opinion, the importance of those practical solutions for users’ freedom of expression is such that the process “should be transparent and under the supervision of public authorities.” The stakeholder dialogue on Article 17 is mentioned by AG Øe as a useful process to find those practical solutions, but since the Commission’s guidelines on the application of Article 17 were published after the drafting of the Opinion, he does not analyse them in detail.

It is worth noting, though, that the final Commission guidance fails to meet the requirements of the AG Opinion in several respects. The Commission had proposed that the exact parameters for the protection of users’ rights could be agreed between industry stakeholders, and it had proposed far-reaching exceptions to the rule that only manifestly infringing content may be blocked, which the AG rejects outright. He clarifies that, if the “earmark” mechanism proposed in the Commission’s guidelines is to be understood as meaning that sharing service providers should block content that is not manifestly infringing ex ante “simply on the basis of an assertion of a risk of significant economic harm by rightholders”, he disagrees with it. This very much echoes the main element of our criticism of the Commission’s guidance.

Recognizing that Article 17(4) “indirectly” requires sharing service providers to use automated recognition tools “in many situations” to comply with their obligations, the Advocate General points out that the Member States need to find practical solutions to help content recognition tools “distinguish between what seems manifest and what is ambiguous”. These solutions should consist of setting parameters below which the automatic blocking of content is prohibited, in particular because the application of an exception “is reasonably conceivable”. The parameters may vary according to the types of copyrighted content and exceptions at issue and will need to take into account “the match rates detected by those tools”. The AG accepts that the mechanism may lead to individual instances of “false positives” (cases of wrongful blocking) in certain situations, but clarifies that the “false positive” rate needs to be as low as possible and that the number of such cases needs to be negligible.

In his view, a flagging mechanism could also be adopted, in addition to the parameters. Users would be allowed to flag “at the time of or immediately after uploading content whether, in their view, they benefit from an exception or limitation”. The service provider would then need to manually review the upload “in order to verify whether the application of that exception or limitation is manifestly precluded or, on the contrary, whether it is reasonably conceivable”.
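
Taken together, the procedural flow suggested by the Opinion – publicly set blocking parameters, a presumption of lawfulness for ambiguous content, user flagging followed by human review, and the Article 17(9) complaint mechanism as an ex post backstop – could be imagined roughly as follows. This is, again, only a sketch under assumed names and thresholds; the Opinion leaves the concrete design to the Member States and the Commission.

```python
from enum import Enum, auto


class Decision(Enum):
    PUBLISH = auto()        # presumed lawful, the upload goes online
    BLOCK = auto()          # manifestly infringing, may be blocked ex ante
    MANUAL_REVIEW = auto()  # user has claimed an exception, prompt human review needed


def moderate_upload(match_rate: float,
                    manifestly_infringing: bool,
                    user_flagged_exception: bool,
                    blocking_parameter: float = 0.9) -> Decision:
    """Illustrative decision flow; not the mechanism mandated by Article 17.

    - Below the blocking parameter (to be set under public supervision), automatic
      blocking is excluded because an exception is reasonably conceivable.
    - A user flag claiming an exception or limitation triggers manual review
      instead of automatic blocking.
    - Only manifestly infringing content may be blocked ex ante; everything else
      is published, with complaint and redress (Article 17(9)) available ex post
      if a block later turns out to be mistaken.
    """
    if match_rate < blocking_parameter:
        return Decision.PUBLISH
    if user_flagged_exception:
        return Decision.MANUAL_REVIEW
    if manifestly_infringing:
        return Decision.BLOCK
    return Decision.PUBLISH
```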

The CJEU tends to follow the AG’s opinion in most cases. A date for the delivery of the ruling has not yet been set, and it will likely take a few months before the Court hands down its decision. Given the importance of the issues at stake for the ongoing implementation of the CDSM directive, it is in everybody’s interest that this happens as soon as possible.
