This week, Politico.eu shared a “non-paper” prepared by the European Commission on Article 13, ahead of the next trilogue on 13 December. The Commission was tasked at the recent trilogue meeting with proposing a compromise solution on the issue of “mitigation of liability in the absence of a license”, in the face of diverging views between the European Parliament and the Council.
In general, any sense of direction on this piece of regulation seems to be lost, with actors participating in the trilogue willing to treat the article like a puzzle whose pieces can be rearranged in any way possible – beyond the scope of any previously negotiated and legitimized mandate. The process once again proves to be obscure and lacking with regard to basic rules of participatory policymaking.
The Commission was given several guidelines. These include the assumptions that platforms do communicate to the public and need to obtain licenses, and that automatic blocking should be “avoided as much as possible”, but not forbidden either.
Earlier this week, we published four principles against which we plan to evaluate the proposed language for Article 13. We believe that any version of Article 13 that does not take these four principles into account will need to be rejected in the final vote taken by the European Parliament.
We decided to check the Commission’s proposal, included in the non-paper, against our principles. This has been made difficult by the fact that what is proposed in the non-paper is in many ways vague. Once it becomes more substantial, we will be able to make a definitive judgement. But even now, the lack of detail on some issues – such as protecting content covered by copyright exceptions from overfiltering – is telling.
1. The scope of application must be as narrow as possible
The Commission proposes a gradual approach that differentiates platforms. The non-paper states that the non-availability of non-licensed, copyright-protected content should be achieved through cooperation between platforms and relevant rightsholders. Such cooperation should take into account the size of the service, the number of uploaded works, the potential economic cost, and the availability of effective and suitable technologies and their cost for service providers.
This is nothing new – such a gradual approach has been present in most versions of Article 13. The proposal lacks a stronger limitation, which could be introduced by a reduction of scope – in particular the narrow application that we set out in our 1st principle. – The 1st principle is not met. (Also note the well-known language on “effective and suitable technologies”, which would again open the door to content filtering by at least some OCSSPs.)
2. Do not introduce general liability exposure
This principle is almost met by the Commission’s proposal, which states that platforms “would in general not be liable if they have cooperated in good faith”. Yet the option of general liability is still considered in cases where the sharing of content has caused significant economic harm. – The 2nd principle is not met.
3. Any content filtering and removal must respect user rights
The Commission pays lip service to this rule by stating that “Content that does not infringe copyright, for example because it is covered by exceptions, should stay on the services’ websites”. We are not satisfied by such a statement, which lacks any detail about measures that would encourage – or force – platforms and rightsholders cooperating on the removal of infringing content to take exceptions seriously. A “robust redress mechanism” that gives users the ability to contest measures is not enough. We believe that meaningful damages for unjustified content removal or blocking are necessary. – The 3rd principle is not met.
4. Measures must be transparent and accountable
The Commission does not propose any measures that provide transparency or accountability. To the contrary, the Commission’s proposal introduces arbitrary categories like “high” and “low” value content that would make any blocking or removal even less transparent to users and creators. – The 4th principle is not met.
So our overall judgement is clear – the Commission’s proposal does not meet the four criteria that we have defined.
Towards a User Generated Content Exception?
There is one additional issue worth pointing out, which relates to something that we have been calling for since the beginning of the reform process: the need for a User Generated Content (UGC) exception.
The non-paper states that “minor uses of content by amateur uploaders should not be automatically blocked”. This is a curious statement for a document that purportedly does not recommend automatic filtering (why then propose an exception to automatic filtering?). Even more interestingly, this could be read as an indirect acknowledgement of the need for a UGC exception. If the Commission really wanted to give “amateurs” who make “minor uses of content” legal security, it should support the proposal to introduce a UGC exception. Such an exception, when properly enforced, would give users much stronger protection than what the Commission now proposes.