Woman attempting to tame a bull

Article 13: Four principles for minimising harm to users, creators and the internet

Later today the negotiators of the Commission, the European Parliament and the Council will meet for the fourth trilogue meeting. After having dealt with less controversial parts of the proposal during the three preceding meetings, tonight's session will finally see a discussion of Article 13 of the proposed DSM directive.

Given that all three legislators bring similar versions of Article 13 to the table, we can expect that a final compromise text will include some version of the Article 13 upload filters. There is still a good chance that the negotiations will be inconclusive, or that the eventual outcome of the trilogue negotiations will not be approved by the Member States or the Parliament (which would mean that the directive fails and there will be no upload filtering requirement for the foreseeable future). But in the context of the ongoing trilogue, the deletion of Article 13 (which has been our position so far) is no longer an option.

This raises the question of how the damage that Article 13 will do to the internet ecosystem and freedom of expression can best be contained. Before we attempt to answer that question, let's take a quick look at the positions that are on the table:

EP position: general blocking of all unlicensed content

The provision adopted by the European Parliament can only be described as a total disaster. As the result of a misguided attempt to remove the mention of "measures" from the text of the article, the European Parliament adopted a version of Article 13 that makes platforms liable for copyright infringement for every single work uploaded by their users. This would include any photo, drawing or text uploaded by a user, regardless of whether these are old works, works that have been created for the express purpose of being shared widely, or the latest blockbuster movie. Because making platforms liable for all works uploaded by their users practically forces them to install filters that block everything that has not been licensed to them, the EP version of Article 13 would turn open platforms into platforms that distribute content licensed by the entertainment industry and nothing else.

This idea is so dangerous that even MEP Cavada, who is no friend of the platforms or their users, criticised it during the negotiations in the European Parliament, saying that it…

.. aligns the liability regime of platforms on the one of traditional broadcaster and therefore imposes an absolute liability in terms of copyright, which is a non-sense. This situation would lead to a general blocking, and by default, of all copyright protected content not covered by a licence, even without a request from rightholders.

Yet this is the position that the European Parliament brings to the trilogue table. On the positive side (which in no way balances out the fundamental flaw of unlimited liability), the EP position contains relatively strong safeguards for users whose uploads are wrongfully filtered and stronger carve-outs for services that are excluded from the filtering requirement.

Council position: Implement upload filters or be liable

The position of the Member States makes much more sense, at least technically (unless we assume that it really is the intention of the EP to turn internet platforms into pseudo-broadcasters, in which case the EP proposal also makes sense). The Council text also makes platforms liable for uploads by their users, but gives the platforms a chance to limit this liability by implementing upload filters. These filters would need to block (and subsequently suppress) all unlicensed materials identified by their rightsholders. If platforms can demonstrate that they have "effective and proportionate" filters in place that "prevent the availability of the specific works identified by rightholders" and that they act "expeditiously to remove or disable access" to infringing works upon notification, they would not be held liable for infringements of works that have been uploaded without authorisation. In other words, the Council text requires platforms to filter works that are in the catalogues of rightsholders, but allows them to leave other works online unless they are notified of infringements.

This much more targeted approach is still hugely problematic. As we have pointed out previously, filtering technology is simply incapable of recognising whether a work is used legally on the basis of an exception to copyright. While the Council text requires that the filters are "implemented by the online content sharing service provider without prejudice to the possibility for their users to benefit from exceptions or limitations to copyright", this remains wishful thinking given the current state of technology. Faced with liability, platforms will almost certainly opt to err on the side of overfiltering, which will result in a structural limitation of users' freedom of (creative) expression.

Our position: Upload filters remain a terrible idea (but these four principles would avoid the worst effects)

As the negotiators try to find a compromise between the positions of the Parliament and the Council, here are four principles for limiting the worst damage that upload filters would do to the online ecosystem and the rights of internet users in the EU. All four can be implemented without changing the overall structure of the various versions of Article 13 currently on the table.

1. The scope of application must be as narrow as possible

We have argued again and again that, given the intention to provide more leverage for rightsholders vis-à-vis the big commercial media platforms, the legislator should make sure that Article 13 only affects those platforms and not every single platform that allows user uploads. To achieve this, the definition of the services affected (the so-called "OCSSPs") must be as narrow as possible. Currently both the EP and the Council text contain an open-ended definition combined with a limited list of exceptions. This approach is overly broad and hostile to innovation (services that don't yet exist cannot, by definition, be included in the list of excluded services). The legislator should limit the application of Article 13 to "for-profit audiovisual platforms that compete with licensed audio or visual services".

2. Do not introduce general liability exposure

As we have outlined above, the European Parliament's version would make platforms liable for all works that are uploaded by their users. This would include works that cannot be licensed by the platforms (for example because they are out of commerce, because they are not considered to be works by their creators, or because the creators simply cannot be bothered to license them). From the perspective of platforms, the risk of being held liable for infringements is too great, which means that platforms will be forced to block all copyrighted works uploaded by their users for which they do not have a licence. This outcome benefits no one and would severely limit the freedom of creative expression of millions of European internet users. To prevent this, the liability of platforms must be limited to those works that they can actually license (as is the case in the Council text).

3. Any content filtering and removal must respect user rights

The biggest problem with a requirement to implement upload filters is that these filters are only capable of identifying works. Filters are simply not capable of determining whether a particular use of a work is infringing or whether it is allowed under an exception or limitation to copyright. Yet both the Council and the Parliament make it clear that any measures implemented need to respect the rights users have under exceptions and limitations. Given the current state of technology this simply amounts to wishful thinking. Both the Council and the Parliament include redress mechanisms that would allow users to challenge unjustified blocking or removal of their uploads. But given that overfiltering will happen by design, it is not acceptable to put the burden of rectifying this on users, who can only act after their rights have been violated. To ensure that measures respect user rights, platforms and rightsholders must both face meaningful damages for unjustified removal or blocking of content. Neither the Council nor the EP version currently meets this condition. Given the current state of technology, such liability would mean that platforms cannot rely on automated blocking or removal and will need to ensure that they are fully licensed.

4. Measures must be transparent and accountable

The way that Article 13 is structured means that copyright enforcement and the safeguarding of fundamental user rights are left to private entities (rightsholders and the online platforms). Privatising enforcement and rule-setting in the hands of for-profit entities undermines the idea of an open and democratic digital media space. To ensure that the measures are not abused by rightsholders and platforms, users and creators must have full transparency regarding any blocking or removal of content by platforms. In the interest of full transparency, all measures should be based on publicly accessible repertoire information that is available to all platform operators. This ensures that rightsholders can be held accountable for unjustified blocking or removal resulting from faulty repertoire claims, and that all platforms have access to the same repertoire information. Neither the Parliament nor the Council text currently includes language that would ensure a sufficient level of transparency for all parties involved.

Unless these four principles are taken into account, any version of Article 13 will need to be rejected when it comes back to the European Parliament for a final vote.

A final chance to minimise unnecessary harm

During tonight’s trilogue negotiations lawmakers should take these four principles into account. Instead of insisting on the flawed texts that they brought to the table, they should attempt to arrive at a text that causes minimal harm to users, creators and the internet ecosystem as a whole. To achieve this they need to make sure that any obligations under Article 13 only affect the services they are intended to affect and only concern works from rightsholders who actually intend to license their works to the platforms. Lawmakers also need to make sure that the measures implemented fully respect user rights and are fully transparent. Anything falling short of this would result in a legislative measure that both fails to achieve its objective and causes substantial collateral damage to the internet ecosystem in the EU. It would further underline the point that the EU copyright framework is about protecting the legacy business models of a few at the expense of freedom of expression and innovation.

An etching and engraving of caterpillars, butterflies and a flower by Maria Sibylla Merian (cropped).