The Czech Republic is one of the last EU Member States yet to implement the CDSM Directive into national law. The Czech government produced an initial draft of the implementation act in November 2020, which has since been making its way through the legislative process. Last week, the Chamber of Deputies of the Czech Parliament adopted the implementation act in third reading and in doing so introduced a number of interesting improvements to the implementation of Article 17.
The initial Czech approach to implementing the provisions of Article 17 has been disappointing. The government’s draft bill was largely limited to restating the provisions of Article 17 without introducing any ex-ante safeguards against overblocking or any meaningful remedies for users whose uploads are blocked or removed by automated upload filters. The proposal also included a few positive elements, such as a targeted definition of OCSSPs and an amendment of the existing caricature and parody exception in the Czech copyright act to explicitly include pastiche.
In February of last year, we pointed out the shortcomings of the Article 17 implementation in a position paper that we published together with Open Content, Wikimedia CZ, Iuridicum Remedium and “Za bezpecny a svobodny internet”. In April of this year, during the subsequent parliamentary proceedings, Pirate Party MP Klára Kocmanová tabled a set of amendments designed to introduce additional safeguards against overblocking into the text.
During last week’s third reading of the bill in the Chamber of Deputies, these amendments resulted in a much better national implementation of Article 17 that substantially alters the government’s original proposal and incorporates some practical rules for platforms implementing these obligations. This is especially interesting since the Czech Republic is one of the first countries to proceed with its implementation after the CJEU ruling in case C-401/19 brought more clarity on the safeguards that need to be included in national implementations of Article 17.
The amendments adopted last week modify § 47 — which implements the licensing and filtering obligations from Article 17(4) — and introduce a new § 51(a).
Two new paragraphs were added to the proposed § 47 of the Copyright Act: a new paragraph (4), which implements the first sentence of Article 17(8) and prohibits general monitoring of content uploaded by users of online platforms; and a new paragraph (3), which looks like an attempt to integrate the criteria developed by the Advocate General in his opinion in Case C-401/19 directly into the Czech Copyright Act. The latter limits the use of automated content recognition technologies that result in the blocking or removal of uploads to cases where the uploads are “identical or equivalent” to a work that rightholders have requested to be blocked. The paragraph further defines that (translation via DeepL):
Identical content means identical content without additional elements or added value. Equivalent content means content which differs from the work identified by the author only by modifications which can be considered as insignificant without the need for additional information to be provided by the author and without a separate assessment of the lawfulness of the use of the work with modifications under this Act.
This language inserts one of the core findings of the CJEU ruling — that platforms can only be required to detect and block content on the basis of the information provided by rightholders and cannot be required to block content which, in order to be found unlawful, would require an independent assessment of the content by the platforms — into the Czech implementation. While it does so by referencing concepts developed by the AG, instead of the criteria from the final judgment, it is a welcome addition that will offer better protection of users’ rights than the literal implementation proposed by the government.
The newly added § 51(a), on the other hand, introduces remedies for cases where platforms repeatedly block or remove lawful user uploads. As we have argued previously, one of the most problematic parts of Article 17 is that it includes strong incentives for platforms to remove user uploads (because they otherwise risk being held liable for copyright infringement) but does not clarify what kind of risks platforms face when their use of upload filters violates users’ freedom of expression.
The newly introduced § 51(a) of the Czech Copyright Act would address this issue by giving “legal persons authorised to represent the interests of competitors or users” the ability to request “a ban on the provision of the service” when platforms “repeatedly and unlawfully” block or remove works uploaded by their users (all translations via DeepL).
A situation in which platforms risk getting shut down as a consequence of repeatedly curtailing their users’ freedom of speech at the behest of rightholders may be an interesting thought experiment — it would completely reverse the power relationships between platforms and their users — but ultimately shutting down platforms for such violations is a problematic and short-sighted remedy.
Is it really the intent of the Czech lawmaker to shut down a platform as a sanction for blocking or removing lawful uploads, and under what conditions would such an extreme measure be imposed? The blocking of an entire platform is clearly counterproductive to the intent of promoting freedom of expression, as it would result in the inaccessibility of all content available on the platform. This makes the remedy introduced in § 51(a) disproportionate, which is even more problematic since the article also implies that it can be invoked by competitors.
It is unclear how the drafters of the amendment envisage § 51(a) to work, but it seems clear that — as adopted by the Chamber of Deputies — it will do much more harm than good. While it provides a powerful incentive for platforms not to overblock, invoking this remedy would result in substantial collateral damage that negatively affects the freedom of expression of all other users of the affected platform.
So what could a more reasonable — and less harmful — remedy look like? What if, instead of threatening to shut down the offending platform, § 51(a) threatened to shut down the upload filters: what if it prohibited the provision of the automated content recognition (ACR) system for the purpose of blocking or removing user uploads?
Such a remedy would be much more proportionate, as it addresses the root of the problem (the malfunctioning ACR system) instead of attacking the host (the platform), and it would not cause any collateral damage for other users of the offending platform. It would also be in line with the CJEU’s reasoning, as it would prevent filtering systems that do not comply with the standards set by the court from being used.
Shutting down ACR systems that repeatedly overblock would still provide platforms with a powerful incentive to calibrate their systems correctly. Given the overall volume of uploads that they have to deal with, platforms have a very strong interest in being able to rely on automated filters for removing manifestly infringing uploads.
For all of these reasons, it seems imperative that the Czech legislator takes another good look at the scope of the remedy introduced in § 51(a). Adding explicit remedies against overblocking is an important element of balanced national implementations, but in its current form the Czech approach risks throwing out the baby with the bathwater.
However, if the scope of the injunctive relief were limited to banning the continued provision of overzealous upload filters, the proposed Czech implementation of Article 17 could even become a template for other Member States seeking to bring their implementations in line with the requirements of the CJEU while otherwise staying relatively close to the text of the directive.