Print of a view of the Osaka Imperial University campus in Nakanoshima Ward (cropped), by Akamatsu Rinsaku.

The Post-DSM Copyright Report: Article 17

This is the final blog post in our series on trends in national copyright policy following the DSM implementation. With the national processes finalised in all but one Member State, we have analysed the transpositions to understand the impact of our proposals, identify areas of convergence and divergence, and assess how those affect the position of users and the Public Domain. In previous blog posts, we discussed the national transpositions of research rights, education rights, the Public Domain safeguard and cultural heritage rights, and the press publishers’ right. Today, we present the results of our comparative analysis on the implementation of Article 17.

Background

Article 17 changed the liability rules for most for-profit content sharing platforms (“online content-sharing service providers” or “OCSSPs”), which are now considered directly liable for copyright-infringing content uploaded by their users and made publicly available on their platforms. As a result, they need to make best efforts to obtain authorisations from rights holders and, if they fail to obtain such authorisations, they need to take a set of steps to be exempted from liability, such as actively carrying out prior checks on users’ uploads for possible infringements (Article 17(4) b) and c) in fine).

Copyright content moderation often requires the use of automatic content recognition and filtering tools. Existing tools are efficient at identifying content, but incapable of understanding the context in which content is used and, therefore, often fail to recognise perfectly legitimate uses, such as quotations and parodies. In order to mitigate the risks to freedom of expression and the right to information, the European legislature established certain ex-ante and ex-post users’ rights safeguards in Articles 17(7) to (9).

COMMUNIA’s main priority during the national transpositions of Article 17 was to ensure that Member States would provide, in their laws, a mechanism giving practical meaning to those users’ rights safeguards. For this reason, our comparative analysis focuses only on the implementation of paragraphs 7, 8 and 9. For an examination of other elements of Article 17, we recommend this report by Christina Angelopoulos.

Most Member States implement Article 17 without specific ex-ante users’ rights safeguards

From a users’ rights perspective, one of the most important elements of any implementation of Article 17 is how platforms can reconcile the conflicting requirements to filter infringing content (17(4) b) and c) in fine) and to ensure that legal uploads are not filtered out (17(7) and (8)).

The Directive does not define the specific measures that platforms should adopt to ensure that legitimate uses remain unaffected by the deployment of content filters in the context of Article 17(4) b) and c) in fine. Yet, as foreseen in Article 17(10), the Commission has issued guidance on Article 17, laying down mechanisms that are intended to flesh out the abstract requirements.

In spite of that, most national implementations remain silent on this crucial question, closely following the wording of Article 17. Out of the 26 Member States analysed, only five incorporate specific ex-ante safeguards targeted at ensuring that automated content filters do not block legitimate uploads:

  • Implements Article 17 with specific ex-ante users’ rights safeguards: AT, BG, CZ, DE, SE
  • Implements Article 17 without specific ex-ante safeguards: BE, HR, CY, DK, EE, FI, FR, GR, HU, IE, LV, LU, MT, NL, PT, RO, SK, SI
  • Does not implement Article 17(7): IT, ES, LT

Disputed uploads required to stay down in 3 Member States

During the discussions surrounding the national implementation of the Directive, some argued that content could be systematically blocked, provided that users could obtain its reinstatement ex-post.

In April 2022, the CJEU issued a ruling rejecting the position that it is enough for platforms to restore legitimate content ex-post, after it has been blocked. The Court clarified that users must have at their disposal an ex-post complaint and redress mechanism, as well as out-of-court mechanisms, when their legitimate content is blocked (Article 17(9)). However, platforms are under an obligation of result not to block lawful content at the time of upload (Article 17(7)). The Court further added that Article 17(8) permits automatic filtering and blocking of content only insofar as no independent assessment of that content is required.

By the time the CJEU issued its ruling, Italy, Spain and Lithuania had already implemented Article 17 into their laws, requiring disputed uploads to stay down during the complaint procedure. These Member States have yet to bring their laws into compliance with the standards set by the CJEU.

Rules governing the application of filters set in 5 Member States

In the 2022 ruling, the CJEU clarified that any filtering system must be able to “distinguish adequately between unlawful content and lawful content” and must not lead to the blocking of lawful uploads. While it is clear that this requires platforms to implement ex-ante safeguards to minimise that risk, it is unclear how those measures should operate in practice.

Lawmakers in five Member States decided to go a step further and design specific rules intended to ensure that the preventive measures employed to monitor users’ uploads do not block legitimate uploads. The criteria formulated by national lawmakers to prevent automated blocking of lawful uploads vary. Some laws set precise rules, while others resort to general and abstract requirements to define the categories of content that cannot be automatically blocked:

  • Content that fulfils minimum thresholds or is pre-flagged by users: AT, DE
  • Content that is not identical or equivalent to copyrighted content: CZ
  • Content that is not likely to infringe copyright with a high degree of probability: SE
  • Content that is not infringing: BG

The German law prohibits automated blocking of uploads “presumably authorised by law.” For an upload to qualify as “presumably authorised by law,” it needs to fulfil certain cumulative criteria defined in the law.1 If the use exceeds the legal thresholds or generates significant revenues, the user is still allowed to flag it as being covered by an exception, in which case it cannot be automatically blocked. These rules do not apply to the use of films or moving images until their first public display has been completed. “Trusted rights holders” may still request immediate removal if they consider the use evidently infringing and commercially harmful.
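To make the interaction of these criteria more tangible, here is a minimal sketch, in Python, of the decision flow described above and in endnote 1. All names, data fields and the function structure are our own invention for illustration; the actual tests under the UrhDaG involve legal assessments that no simple threshold check can replace.

```python
# Illustrative sketch only: an approximation of the German "presumably
# authorised by law" logic described above and in endnote 1. All identifiers
# are hypothetical and do not come from any real library or statute text.

from dataclasses import dataclass


@dataclass
class Upload:
    share_of_matched_work: float         # fraction of the matched work that is reused (0.0-1.0)
    combined_with_other_content: bool    # are the reused parts combined with other content?
    seconds_of_audio_or_video: float
    characters_of_text: int
    kilobytes_of_graphics: float
    is_non_commercial: bool              # minor uses must be non-commercial
    flagged_as_legitimate_by_user: bool  # user pre-flagged the upload as covered by an exception


def is_minor_use(u: Upload) -> bool:
    """Minor use: a non-commercial use below the quantitative thresholds in
    endnote 1 (less than 15 seconds of audio or video, 160 characters of text
    or 125 kB of graphics)."""
    return (u.is_non_commercial
            and u.seconds_of_audio_or_video < 15
            and u.characters_of_text < 160
            and u.kilobytes_of_graphics < 125)


def presumably_authorised(u: Upload) -> bool:
    """Cumulative criteria: less than 50% of the original work is used, the
    reused parts are combined with other content, and the use is minor."""
    return (u.share_of_matched_work < 0.5
            and u.combined_with_other_content
            and is_minor_use(u))


def may_block_automatically(u: Upload) -> bool:
    """Uploads that are presumably authorised, or that the user has flagged as
    covered by an exception, must not be blocked automatically; they can only
    be challenged ex-post through the complaint procedure."""
    return not (presumably_authorised(u) or u.flagged_as_legitimate_by_user)
```

The sketch deliberately leaves out the special regimes mentioned above, such as the exclusion of films and moving images before the completion of their first public display and the immediate-removal option for trusted rights holders.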

A similar approach is followed by Austria, which carves out “uses of small parts of a work”2 from the obligation to prevent the availability of protected works. Like in Germany, uploaders have the ability to flag uploads as legitimate. In line with the Commission’s implementation guidance for Article 17, these provisions do not apply to commercially valuable works.

The Czech law takes inspiration from the Advocate General’s Opinion and only allows the application of blocking measures to content that is “identical or equivalent” to the matched work. These concepts are defined in the law.3

Sweden only permits automated blocking if the upload “is likely to infringe copyright with a high degree of probability”, whereas Bulgaria prohibits platforms from blocking or removing any content which does not infringe copyright and related rights. Neither law formulates criteria to help determine when content must be presumed to infringe, or to be highly likely to infringe, copyright.

Additional ex-post safeguards against overblocking

In order to understand the impact of Article 17 on their freedom of expression, and to identify instances of structural overblocking, users need access to information about platforms’ content moderation practices. Equally important for the enforcement of users’ rights is the ability of users’ organisations to act against platforms that repeatedly remove or block legitimate content. While the DSM Directive imposes no transparency obligation towards users, nor foresees collective redress mechanisms for users in the context of Article 17, such measures can be found in a few national laws:

  • Transparency obligation towards users:4 AT, LT, SE
  • Collective redress mechanisms for users: DE, HU, SE

Another area where the Directive remains silent is that of sanctions in case of abuse of the cooperation mechanism established by Article 17. Such measures are important to create an incentive against overblocking and against the abusive submission of false ownership claims. While most Member States ignored this issue, a few decided to include these extra safeguards for the protection of users’ rights:

  • Measures against abuse by platforms: AT, CZ, EE, FI, DE, SE
  • Measures against abuse by rights holders: DE, FI, LT

In Austria, the supervisory authority may fine sharing platforms that block legitimate uploads “systematically and to a considerable extent”, whereas the Estonian law authorises the supervisory authority to take action against platforms that fail to comply with their obligations to protect substantive and procedural user rights, including by applying a non-compliance levy.

In Germany, a users’ association may claim injunctive relief against a platform that repeatedly blocks lawful uploads. Similarly, in Sweden a user or a users’ organisation can initiate court proceedings against platforms for failure to meet their obligations to protect lawful uploads. In addition, under Swedish law, users are also entitled to damages for the intentional or negligent failure of platforms to protect their rights. In Finland, users have the right to initiate court proceedings against platforms to compel them to enable access to lawful uploads. Under the Czech implementation law, a consumer organisation or an entity representing the interests of competitors may demand a ban on the provision of the service if a platform repeatedly or wrongfully blocks or removes lawful user uploads. While this measure provides a powerful incentive for platforms not to overblock, the remedy is clearly disproportionate and counterproductive: invoking it would cause substantial collateral damage, making all content on the platform inaccessible and thereby harming the freedom of expression of all other users of the affected platform.

Rights holders or alleged rights holders that abuse blocking requests will be excluded from the procedures provided by the German law (notice-and-takedown, notice-and-staydown and the earmarking mechanism) for an appropriate period of time.5 An alleged rights holder who intentionally or negligently demands the blocking of a third-party work as their own work, or of a public domain work, is also obliged to compensate the platform and the user for the resulting damage. In Lithuania, rights holders that abuse the automated content identification technologies or that make “malicious or unfounded complaints” can also be subject to sanctions, but the law does not define what these sanctions are. In Finland, the user and the platform have the right to initiate court proceedings against rights holders for damages resulting from an unjustified blocking request.

Conclusion

Copyright content moderation on OCSSPs is currently subject to a fragmented legal framework. Our analysis of the national implementations of Article 17 reveals three different approaches to users’ rights, which translate into a multiplicity of national regimes across the EU:

  • national laws that flesh out Article 17(7) to (9), imposing substantive ex-ante measures on sharing platforms to protect users’ rights at upload, together with complementary ex-post procedural safeguards to sanction the wrongful removal of legitimate content;
  • national laws that restate the text of the Directive, leaving it to platforms to decide which measures to implement to give effect to Article 17(7) to (9), and to courts to assess whether those measures are balanced;
  • national laws that fail to implement Article 17(7), requiring instead that platforms remove disputed uploads and only restore legitimate content ex-post.

The last set of laws clearly goes against the CJEU case law, and should be challenged by the Commission. The Court clarified that automatic filtering and blocking of content should be limited to situations where no independent assessment is required. This means that disputed uploads need to stay up until a human review has been concluded.

As to the national lawmakers that merely restate the provisions of the Directive, we believe they are not fulfilling their obligation to protect competing fundamental rights. These national laws may not have manifest compliance issues with the Directive, but it is clear that they are failing the users of sharing platforms by delegating copyright content moderation to intermediaries without any legislative criteria for achieving a balance of interests. Platforms will have the power to determine when an upload must be permitted and, in the absence of sanctions against overblocking, they will have little incentive to protect users’ freedom of expression and freedom to impart information. In fact, the vast majority of disputed uploads will never reach the courts, not least because judicial enforcement of users’ rights is absent from most national laws.6

From COMMUNIA’s perspective, Member States that have introduced specific ex-ante measures to prevent overblocking, together with sanctions against abuse of the cooperation mechanism established by Article 17(4) b) and c) in fine, are better prepared to address the risks that the use of upload filters entails for users of sharing platforms. The German model, in particular, lays down clear and precise rules on the application of automated filtering technologies that help platforms reconcile their filtering obligations with fundamental freedoms. This model is one of the most ambitious implementations of Article 17 and should be a source of inspiration for other Member States wishing to set a higher standard for the protection of users’ rights and a better balance between the effectiveness of filtering systems and freedom of speech.

Endnotes

  1. The use must consist of less than 50% of the original protected work, must combine the parts of the work with other content, and must be minor (a non-commercial use of less than 15 seconds of audio or video, 160 characters of text or 125 kB of graphics). See §§ 9-10 Urheberrechts-Diensteanbieter-Gesetz (UrhDaG).
  2. The Austrian law defines similar cumulative criteria. See §89b(3) Urheberrechtsgesetz.
  3. “Identical content” is defined as content “without additional elements or added value” and “equivalent content” as content that contains only immaterial modifications “without the need for additional information to be provided by the author and without an independent assessment of the legitimacy of the use”. See §47(3) Zákon č. 121/2000 (autorský zákon).
  4. It should be noted that, while Germany does not extend the transparency obligation towards users, it requires platforms to give researchers access to data on their content moderation practices.
  5. A user that repeatedly incorrectly marks a use as permitted can also be excluded from the opportunity to flag permitted uses for a reasonable period of time.