Specific Monitoring Obligation

A lot has been said about Article 13 since the European Commission published its Copyright Directive proposal. However, much of this is based on fear-mongering and false interpretations of the text.

The obligation set by Article 13 is to cooperate with rights holders who wish to do so over the use of their works

There are services that can be covered by the E-Commerce Directive (ECD) safe harbour provisions but which are still subject to certain rules under Article 13 of the Copyright Directive proposal: namely, services that store and provide access to “large amounts” of works.

Cooperation will enable effective and targeted monitoring of the uses of protected content, which in turn will allow an accurate distribution of remuneration to authors and rights holders. Should these services not wish to conclude an agreement with the rights holder, they are additionally obliged to cooperate on measures to prevent the works concerned from appearing on their platform.

Licensing is always the preferred option for authors’ societies

The obligation to cooperate improves the possibilities of keeping content available while ensuring appropriate remuneration for creators. Authors’ societies are keen to make more works available: since they represent up to 30 million works and counting, trying to block them is neither a realistic nor a desirable option.

Monitoring will happen based on what rights holders provide

Article 13 of the Copyright Directive proposal only requires matching the data provided by the rights holders against the data uploaded to the service's platform. That excludes random or total filtering and instead allows for very focused content recognition tools that facilitate the legitimate use of works on the platform and result in less unlawful blocking.

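To make that distinction concrete, the following sketch illustrates, in a deliberately simplified and purely hypothetical form, what such targeted matching could look like: the service compares uploads only against reference data supplied by rights holders, and makes no judgement of its own about content for which no reference has been provided. Every name in the sketch is invented for illustration, and the plain hash used as a "fingerprint" merely stands in for the far more robust recognition techniques used in practice.

```python
from __future__ import annotations

import hashlib


def fingerprint(content: bytes) -> str:
    """Illustrative stand-in for a content fingerprint: a plain SHA-256 digest."""
    return hashlib.sha256(content).hexdigest()


class TargetedMatcher:
    """Matches uploads only against reference data supplied by rights holders."""

    def __init__(self) -> None:
        # Maps a reference fingerprint to the identifier of the corresponding work.
        self.references: dict[str, str] = {}

    def register_reference(self, work_id: str, reference_content: bytes) -> None:
        """Record the data a rights holder has provided for one of its works."""
        self.references[fingerprint(reference_content)] = work_id

    def check_upload(self, upload: bytes) -> str | None:
        """Return the matched work identifier, or None if no reference matches.

        Content with no matching reference is left untouched: the service never
        has to determine the copyright status of an upload on its own.
        """
        return self.references.get(fingerprint(upload))


if __name__ == "__main__":
    matcher = TargetedMatcher()
    matcher.register_reference("work-0001", b"reference data supplied by the rights holder")

    match = matcher.check_upload(b"data uploaded by a user")
    if match is None:
        print("No rights holder reference matches; the upload is not touched.")
    else:
        print(f"Upload matches {match}: report the use under the licence "
              "or apply the measure agreed with the rights holder.")
```

The design point is the scope of the check: the service consults nothing but the reference set handed to it, which is what distinguishes this kind of targeted matching from the general, service-initiated filtering addressed in the case law discussed below.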

Previous court rulings address outdated and inappropriate methods

In its previous cases (e.g. the Netlog ruling), the European Court of Justice (CJEU) noted that a technology required to check all the data uploaded by a service's users, determine whether any of it was unlawful and then block the relevant users would potentially impact freedom of expression, since such a system could also risk blocking lawful data. In that case, the service (Netlog) was alone in carrying out this task and in establishing the rights status of, and the facts regarding, all the content on its platform.

Such a system bears no comparison with modern content recognition systems of the kind contemplated by Article 13 (and currently in use by some service providers). Indeed, Article 13 of the Copyright Directive proposal only requires matching the data provided by the rights holder against the data uploaded to the service's platform. Since the data will be provided by the rights holder, the service does not need to establish the copyright status of the content itself, and the monitoring will be targeted, as the service only needs to match the relevant data.

Since cooperation plays a key role, and licensing gives the content a legal basis to remain on the platform, any remaining risk of harming freedom of expression is also eliminated. If content is nonetheless wrongfully removed, appropriate and functional redress mechanisms are provided for. This is an essential safeguard for consumers vis-à-vis platforms that tend to block more content than they should, simply to avoid any potential legal action, with complete disregard for their consumers' freedom of expression.

  • For a further in-depth analysis of the relevant CJEU cases and of Article 13's compatibility with the acquis communautaire, read more here.