Specific Monitoring Obligation

Much has been said about Article 13 since the European Commission published its Copyright Directive proposal. However, much of this commentary is based on fear-mongering and misinterpretations of the texts provided by the European Commission, the European Parliament and the Council. Moreover, given that all three institutions take similar lines on this issue, based on their legal analysis, debates and legal service opinions, the credibility of such fear-mongering by organisations funded by the platform services concerned is questionable, at the very least.

What is required of Online Content Sharing Service Providers (OCSSPs) is "cooperation", an obligation that already applies to their competitors operating legitimately on the digital market

Cooperation will enable effective and targeted monitoring of the uses of protected content, which in turn will enable accurate distribution of remuneration to authors and rights holders. Additionally, these services are obliged to cooperate on measures to prevent works from appearing on their platform, should they not wish to conclude an agreement with the rights holder.

Licensing is always the preferred option for authors’ societies

The obligation to cooperate makes it easier to keep content available while ensuring appropriate remuneration for creators. Authors' societies are keen to make more works available. Since they represent up to 30 million works and counting, trying to block them is neither a realistic nor a desirable option.

Monitoring will happen based on what rights holders provide

Article 13 of the Copyright Directive proposal only requires matching the data provided by rights holders against the data uploaded to the platform. This excludes random or blanket filtering and instead calls for narrowly targeted content recognition tools, which support the legitimate use of works on the platform by reducing unlawful blocking.

Previous court rulings addressed outdated and inappropriate methods

In its previous case law (e.g. the Netlog ruling), the Court of Justice of the European Union (CJEU) noted that a technology required to check all data uploaded by a service's users, determine whether any of it was unlawful, and then block the relevant users could potentially impact freedom of expression, since such a system would risk also blocking lawful data: the service (Netlog) would have been solely responsible for carrying out this task and establishing the rights status of all content on its platform.

Such a system bears no comparison with the modern content recognition systems contemplated by Article 13 (and already in use by some service providers). Indeed, Article 13 of the Copyright Directive proposal only requires matching the data provided by the rights holder against the data uploaded to the platform. Since the data is supplied by the rights holder, the service does not need to establish the copyright status of the content itself, and the exercise remains targeted, as the service only needs to match the relevant data.

Since cooperation plays a key role, and licensing gives the content a legal basis on the platform, any remaining risk of harming freedom of expression is also eliminated. If content is in fact removed wrongfully, appropriate and functional redress mechanisms are provided for. This is an essential safeguard for consumers vis-à-vis platforms that tend to block more content than they should, simply to avoid potential legal action, with complete disregard for their consumers' freedom of expression. It should also be noted that consumers today have no legal recourse against platforms for wrongful removals undertaken by those platforms.

  • For a further in-depth analysis of the relevant CJEU cases and of their compatibility with the acquis communautaire, read more here.