Copyright Filtering Part II: Article 17 DSM Directive and German Copyright Law

Lawmakers in the Member States are busy drafting their bills for the implementation of the DSM Directive into national copyright laws, as the respective deadline of 7 July 2021 approaches. In this publication we look at the German BMJV’s revised and consolidated version of the draft bill.

The passionate debate about Article 17 DSM Directive and (presumably) compulsory “upload filters” continues unabated.

Earlier this year, the German Federal Ministry of Justice and Consumer Protection (‘BMJV’) published its first discussion draft bill for the Directive’s implementation (as reported in our previous article), and Minister of Justice Christine Lambrecht promisingly declared back then that upload filters would be “widely expendable” under her proposal.

In reaction to the diverse feedback received, the BMJV has now presented a revised and consolidated version of the draft bill – but with the changes made, it will quite clearly fail to live up to the Minister’s promise.

Recap on Article 17 DSM Directive and the BMJV’s first draft

Article 17 DSM Directive prescribes a new liability regime for ‘online content-sharing service providers’, setting out that certain online platform providers shall be directly liable for the dissemination of copyrighted content uploaded by their users. Platform operators are to seek permission, e.g. through licensing agreements – and where they do not obtain it, they must take preventive measures in order to avoid liability. While not explicitly imposed, it is commonly understood that this regime will require the installation of content recognition and “fingerprinting” technologies, which would in turn entail both legal and practical challenges for platform operators.

The BMJV was clearly aware of these challenges when publishing its first draft for the Directive’s implementation (in a new ‘Copyright Service Provider Act’, abbreviated in German as ‘UrhDaG’), and tried to creatively avoid a mandatory upload-filter requirement through a pre-flagging mechanism: Instead of monitoring uploads, platform providers were to inform their users in the upload process about the need for permission and to allow them to label their content as “permissible”. Content labelled “permissible” would not need to be removed or blocked (unless the labelling were evidently incorrect), and the platform operator would not be held liable for such content.

That approach would indeed have resulted in a modified notice-and-takedown procedure rather than a filtering obligation, and was therefore generally welcomed by platform operators. At the same time, content owners and right holder associations unsurprisingly expressed discontent, arguing that it failed to strengthen platform liability as intended under Article 17 DSM Directive.

Revisions of the BMJV’s second draft

Reacting to stakeholder criticism, the BMJV has meanwhile published its second draft for the Copyright Service Provider Act (see here, p. 29 ff). This second draft comes with various clarifications and a number of targeted amendments. The changes made to the pre-flagging mechanism in Sec. 8 of the Act seem minor at first glance, but on closer inspection it becomes clear that they alter the envisaged liability regime significantly, with major practical implications:

  • Sec. 8 (1) UrhDaG now sets out that platform providers must enable pre-flagging by their users (only) if the right holder has already filed a blocking request for the content in question. Where this is the case, the provider must inform the user about the blocking request immediately. If the user then flags the content as “permissible”, the platform provider must likewise inform the respective right holder immediately as per Sec. 8 (2) UrhDaG.

These changes are critical from both a legal and a practical perspective: While the new wording significantly reduces the instances where pre-flagging must be made available compared to the previous draft (which would have applied to any upload), under the new draft the platform’s algorithms would need to know whether or not any uploaded content falls under a blocking request. In other words, there must now be a “pre-check” before the “pre-flag”. This would effectively require the platform provider to use content recognition technologies through the back door – and since the user is to be informed immediately, the algorithms would arguably have to work in real time (see the illustrative sketch after this list).

  • Sec. 8 (2) UrhDaG now additionally sets out that content must be made available by the platform if it is flagged as “permissible” (and where this flag is not evidently incorrect).

This goes beyond the first draft: The previous version only set out that the right holder could not readily request removal or blocking of flagged content, but left room for the platform provider to refuse publication of the content for other reasons. The new provision now reads as though it could grant the uploader an enforceable “right to display” against the platform operator. But such a right would be in stark conflict with the provider’s obligations to remove content that is unlawful beyond copyright – such as trademark infringements, violations of personality rights, or terrorist content. In its attempt to prevent overblocking, it seems that the BMJV has not really thought through how the new law would interact with the other content moderation obligations imposed on providers under the eCommerce Directive and various other EU and national laws, which will soon be increasingly harmonised under the Digital Services Act.

Furthermore, given that content must be removed if its labelling as permissible is evidently incorrect, much of the efficiency gain from letting the uploader confirm permissibility is arguably lost: Under the new wording, the provider would need to double-check the copyright compliance of the content in any event in order to satisfy its obligations vis-à-vis the right holder.
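To make the mechanics of this “pre-check before pre-flag” flow concrete, here is a minimal, purely illustrative sketch in Python. Everything in it (the class names, the exact-match fingerprint comparison, the notification stubs) is our own assumption for the purpose of illustration; the draft prescribes obligations, not any particular implementation:

```python
from dataclasses import dataclass


@dataclass
class BlockingRequest:
    right_holder: str
    fingerprint: str  # stand-in for a reference file / content fingerprint


@dataclass
class Upload:
    user: str
    fingerprint: str
    pre_flagged_permissible: bool = False


def evidently_incorrect(upload: Upload) -> bool:
    # The draft does not say how an "evidently incorrect" flag is detected;
    # a real platform would need some heuristic here. Assume none fires.
    return False


class Platform:
    def __init__(self) -> None:
        self.blocking_requests: list[BlockingRequest] = []

    def _matching_request(self, upload: Upload) -> BlockingRequest | None:
        # The "pre-check": matching every upload against all filed blocking
        # requests in real time -- in effect, an upload filter. Production
        # systems would use fuzzy fingerprint matching, not exact equality.
        for request in self.blocking_requests:
            if request.fingerprint == upload.fingerprint:
                return request
        return None

    def handle_upload(self, upload: Upload) -> str:
        request = self._matching_request(upload)
        if request is None:
            return "published"  # no blocking request, so no pre-flagging step
        # Sec. 8 (1): inform the user about the blocking request immediately.
        print(f"notify user {upload.user}: blocking request by {request.right_holder}")
        if upload.pre_flagged_permissible and not evidently_incorrect(upload):
            # Sec. 8 (2): inform the right holder immediately; the flagged
            # content must nevertheless be made available.
            print(f"notify {request.right_holder}: content pre-flagged as permissible")
            return "published (pre-flagged)"
        return "blocked"


# Demo: an upload matching a blocking request, pre-flagged by the user.
platform = Platform()
platform.blocking_requests.append(BlockingRequest("LabelCo", "abc123"))
print(platform.handle_upload(Upload("alice", "abc123", pre_flagged_permissible=True)))
```

Even in this toy version, the filtering step is unavoidable: the moment a blocking request exists, every single upload has to pass through the matching logic before the pre-flagging dialogue can even be offered.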

With these changes, the draft now proposed clearly falls short of Minister Lambrecht’s earlier promise: It is hard to see how platform providers could comply with the above obligations without implementing content recognition technologies – even though these are not explicitly prescribed. Unsurprisingly, the new draft has already received severe criticism for this very reason from user rights activists and opposition politicians, who see their view confirmed that Article 17 DSM Directive will inevitably lead to upload filters through the back door (see e.g. here).

Outlook

‘To filter or not to filter?’ On the basis of the BMJV’s latest draft, the answer – while not expressly spelled out – is nevertheless quite obvious: Platform providers would have to filter uploaded content.

Given the various deficiencies of the second draft, we can expect – and hope for – further amendments. The parliamentary readings of the draft in the German Bundestag have not yet been scheduled, but given the tight implementation deadline of 7 July 2021, there is not much time left for the lawmakers in Germany (and elsewhere in the EU) to find common ground.

Stay tuned for our articles on further developments around the implementation of the DSM Directive across the EU and on the upcoming changes under the Digital Services Act!

Authored by Anthonia Ghalamkarizadeh and Florian Richter
