Article 17 of the EU Copyright Directive (2019/790) and its Impact on Human Rights

Written by Krishen Soogumaran, LLB, University of Hertfordshire.
Edited by Chrystal Foo.
Reviewed by Florence Yeap Xiao Qing, Luc Choong and Celin Khoo Roong Teng.

Article 17 of the European Union Directive on copyright and related rights in the Digital Single Market (2019/790) imposes strict obligations on online intermediaries to prevent the uploading of copyright-infringing works. It requires intermediaries either to obtain authorisation from rightsholders or, failing that, to ensure the unavailability of infringing works through content-filtering mechanisms. However, filtering mechanisms are not faultless, and the adoption of such technology could therefore have adverse effects on human rights, such as the freedom of expression, amongst others.

I.   ARTICLE 17: INTRODUCTION
Article 17 of the European Union (EU) Directive on copyright and related rights in the Digital Single Market (2019/790) (DSMD) is part of a larger initiative to modernise EU copyright rules.[1] Following the advent of Web 2.0 technologies, one of the issues faced by rightsholders was that intermediaries were relieved from liability for infringing works uploaded onto their platforms.[2] This allowed intermediaries to continue generating profit from said works, mainly through advertising revenue. In addition, rightsholders were poorly remunerated by intermediaries when compared to the revenue generated by their content. This came to be known as the ‘value gap’.[3] Article 17 of the DSMD aims to solve this by placing strict primary liability on intermediaries for infringing content uploaded onto their platforms.[4] Liability can be avoided by obtaining authorisation from rightsholders for the communication of copyrighted works, i.e., licensing agreements; by ensuring the unavailability of infringing works from the outset, i.e., using upload filters;[5] or by proving that the work falls under a copyright limitation.[6] These limitations may be further developed at the national level for intermediaries to rely on.

Article 17 of the DSMD targets online content-sharing service providers (OCSSPs), which store and give the public access to a ‘large amount’ of copyright-protected works for profit.[7] The Commission expressed that only OCSSPs playing an ‘important role’ should be affected.[8] However, the terms used in the provision, i.e., ‘large amount’ and ‘important role’, remain ambiguous.[9] As such, difficulty arises in ensuring compliance, since OCSSPs are unable to determine their eligibility and prepare for the obligations under Article 17 of the DSMD. Consequently, this could affect their freedom to conduct business.[10] Although the European Commission (EC) has clarified that an individual assessment of the eligibility of OCSSPs will be conducted, OCSSPs are nonetheless left awaiting said assessment.[11] Thus, even before discussing filtering obligations, it is submitted that Article 17 of the DSMD has already encroached upon a fundamental right.
 
Paragraph (4)(b) of Article 17 of the DSMD provides that where authorisation cannot be obtained despite best efforts, an OCSSP must make best efforts to ensure the unavailability of infringing works, in accordance with high industry standards of professional diligence.[12] If said works nonetheless appear, paragraph (4)(c) obliges OCSSPs to act expeditiously in disabling access to or removing infringing works upon notice from rightsholders.[13] It is important to note that these obligations only apply once rightsholders have provided the necessary information.[14]
 
On further analysis, paragraph (4)(b), by implication, requires the installation of a filtering mechanism.[15] Though the implementation of a content filter enforces intellectual property (IP) rights, it also affects fundamental human rights, hence the need for this analysis. The rights in question are enshrined in the Charter of Fundamental Rights of the European Union (Charter).[16] The Court of Justice of the European Union (CJEU) has recognised the significance of this effect, indicating that Member States must strike a ‘fair balance’ between human rights and IP rights when transposing directives.[17] This article aims to critically analyse whether Article 17 balances the protection of IP rights with the competing human rights at stake.

II.  RESPECT FOR PRIVATE AND FAMILY LIFE AND PROTECTION OF PERSONAL DATA
The right to respect for private and family life[18] and the right to protection of personal data[19] are both affected by the filtering obligations under Article 17 of the DSMD. In this regard, an upload filter is likely to infringe both rights. It operates by actively processing all data stored or, in the case of deep packet inspection systems, by inspecting data packets, which allows users to be identified.[20] An understanding of this process is crucial, as filters can only effectively identify and remove infringing works by processing both content (audio, visuals and captions) and context (user profile and location) on an OCSSP’s servers. The ‘context’ is particularly important, as it allows filtering algorithms to identify whether a work is licensed (user profile) or whether it falls under a copyright exception, by ascertaining previous uploads and the location of said uploads.[21]
 
Looking at current filtering technologies, intermediaries such as YouTube employ filtering software (Content ID) to detect infringing content by scanning uploaded works against a database of audio and visual references supplied by rightsholders.[22] This is an example of an algorithm that filters purely on content and does not employ any form of context filtering. Although this is submitted to be less of an intrusion into the aforementioned human rights, it is also comparatively less effective in enforcing IP rights and may not be able to fulfil the obligations under paragraph (4)(b).
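To make the distinction concrete, the following is a minimal, hypothetical sketch of the difference between a content-only filter and one that also takes context into account. The data structures, names and matching logic are illustrative assumptions only; they do not reflect how Content ID or any commercial filtering system is actually built.

```python
# Illustrative sketch only: the reference data, names and matching logic are
# assumptions for explanation, not the design of Content ID or any real system.
from dataclasses import dataclass

@dataclass
class Upload:
    fingerprint: str        # hash/fingerprint derived from the audio or visual content
    uploader_id: str        # "context": who uploaded the work
    claims_exception: bool  # "context": user asserts quotation, parody, etc.

# References supplied by rightsholders (cf. Art 17(4)(b), which applies only once
# rightsholders have provided the relevant and necessary information).
REFERENCES = {"fp-001": "Song A", "fp-002": "Film B"}
LICENSEES = {"Song A": {"user_42"}}  # licences known to the platform

def content_only_block(upload: Upload) -> bool:
    """Block purely on a content match, as a fingerprint-only filter would."""
    return upload.fingerprint in REFERENCES

def context_aware_block(upload: Upload) -> bool:
    """Block only where the content matches and no licence or claimed exception applies."""
    work = REFERENCES.get(upload.fingerprint)
    if work is None:
        return False  # no match against rightsholder references
    if upload.uploader_id in LICENSEES.get(work, set()):
        return False  # licensed use: blocking would over-block
    if upload.claims_exception:
        return False  # claimed exception (Art 17(7)): route to review rather than auto-block
    return True

if __name__ == "__main__":
    clip = Upload(fingerprint="fp-001", uploader_id="user_42", claims_exception=False)
    print(content_only_block(clip))   # True  -> blocked despite the uploader holding a licence
    print(context_aware_block(clip))  # False -> licence recognised, so not blocked
```

Even in this toy example, the trade-off described above is visible: the context-aware check avoids over-blocking only because it processes user-level data (who uploaded the work and what they claim about it), which is precisely the privacy concern engaged by Articles 7 and 8 of the Charter.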
 
In addition, such a filtering process may amount to a general monitoring obligation, as it actively monitors all data stored on an OCSSP’s server.[23] This would contradict Article 17(8) of the DSMD, which prohibits the imposition of any general monitoring obligation, unless the new position of OCSSPs under paragraph (3) justifies such an obligation.[24]
 
Article 17 of the DSMD gives no express direction on the type of filter to be used. However, paragraph (7) requires OCSSPs to ensure that copyright-exempted works are not blocked.[25] Furthermore, paragraph (4) provides that filtering may only be relied upon after attempts to obtain authorisation fail, which requires the identification of licensed works.[26] By virtue of these requirements, a content-only filter would be ineffective. Thus, it is submitted that filters are unable to protect both users and rightsholders unless filtering algorithms improve to the point of complying with Article 17 of the DSMD without infringing users’ privacy.
 
Therefore, it is submitted that Article 17 of the DSMD does not strike a fair balance between Articles 7 and 8 of the Charter for users and the enforcement of IP rights for rightsholders.[27]
 
III.  FREEDOM OF EXPRESSION AND INFORMATION
The right to freedom of expression and information is also affected by the filtering obligations under Article 17 of the DSMD.[28] This is because filtering algorithms, in theory, are expected to distinguish between lawful and unlawful content.[29] However, most algorithms today, such as Audible Magic, rely on references uploaded by rightsholders or on a pre-existing content registry.[30] Their performance is thus reliant on limited information and is consequently subject to error.[31] A culmination of these errors could lead to over-blocking, where filters fail to distinguish content correctly. This would prevent a user’s access to lawfully uploaded copyrighted material, such as copyright-exempted works.[32] As such, it infringes a user’s right to receive and impart lawful content.[33]
 
Article 17 of the DSMD does attempt to counterbalance these negative effects. Paragraph (7) aims to prevent over-blocking and emphasises the use of copyright limitations to protect a user’s right to receive and impart information.[34] However, as discussed under the previous heading, the practical limitations of current filtering algorithms may make this aim difficult to respect. In addition, paragraph (9) obliges OCSSPs to provide a mechanism for users to dispute wrongful takedowns by upload filters.[35] As an attempt to compensate for over-blocking, it strengthens the user’s right to an effective remedy.[36] However, despite these measures, users may be reluctant to rely on them because of the risks involved. For instance, YouTube offers such a measure, whereby one can appeal against content demonetisation after being reported for copyright infringement. Nonetheless, users often choose not to appeal because, if they lose, their entire video will be taken down.[37] The measure is thus submitted to be too adversarial in practice and does not justify the resulting barrier to information. Furthermore, paragraph (9) requires human review of such disputes, which imposes a further cost on OCSSPs.[38]
 
Therefore, the user’s freedom of expression is not balanced with the enforcement of IP rights unless current filtering technology improves or counterbalancing measures prove more effective.
 
IV.   FREEDOM TO CONDUCT A BUSINESS
The freedom to conduct a business in accordance with the law[39] is related to Article 3 of Directive 2004/48/EC on the enforcement of IP rights (EIPR), which provides that enforcement must be fair and equitable in order to avoid creating barriers to legitimate trade.[40] An active and indiscriminate filtering system, as required by Article 17 of the DSMD, is unlikely to satisfy Article 3(1) of the EIPR, as it involves the installation of a costly and complicated system at the OCSSP’s expense.[41] The high cost is attributable to the limited options available on the market, with Audible Magic and Gracenote being the most established.[42] If an OCSSP wishes to develop a proprietary filtering system, it is likely to incur even greater costs and resources; YouTube’s Content ID, for example, cost over USD 100 million to develop.[43] Hence, Article 3(1) of the EIPR may only be satisfied once the filtering software market becomes more competitive, thereby lowering the cost of adoption. Therefore, the enforcement of IP rights by adopting an upload filter is not balanced with the OCSSP’s freedom to conduct business.
 
To substantiate these views, the case of SABAM v Netlog shall be discussed.[44] In SABAM, an injunction was sought to compel the use of a filtering system to identify and block infringing works on Netlog’s platform.[45] The CJEU held that such an order did not strike a fair balance.[46] This was because the filtering system would allow users to be identified, could block lawful content, and would be complicated, costly and permanent.[47] Thus, it was not balanced against the enforcement of intellectual property rights, an outcome mirroring the earlier decision in Scarlet Extended v SABAM.[48]
 
In contrast, the CJEU in UPC Telekabel Wien v Constantin Film Verleih GmbH allowed an injunction to block infringing content.[49] A fair balance was held to be struck, as the service provider had the discretion to adopt any measures to comply with the injunction.[50] The CJEU further elaborated that a fair balance would be achieved if said measures protected IP rights without restricting lawful access to information.[51] In addition, the measures had to be targeted, bringing an end only to the specified infringement. The CJEU held that, in contrast to general filtering, such measures did not affect a user’s freedom of expression.[52] This was later reaffirmed in Tobias McFadden v Sony Music Entertainment Germany GmbH.[53]
 
Although UPC Telekabel Wien and McFadden concerned internet service providers rather than hosting services, they still provide useful guidance on the CJEU’s benchmark for achieving a ‘fair balance’.[54] However, this may change following Article 17 of the DSMD, as the position of OCSSPs has shifted from that of mere intermediaries to, arguably, that of communicators of works to the public.[55] The latter issue arises in Republic of Poland v European Parliament and Council of the European Union, which concerns an application by the Polish Government to annul the filtering obligations under Article 17 of the DSMD on the basis that they limit the right to freedom of expression and information. The case was heard before the CJEU on 10 November 2020 and has yet to be officially reported.[56]
 
V. CONCLUSION: THE WAY FORWARD
In conclusion, it is submitted that the filtering technology required under Article 17 of the DSMD fails to strike the right balance between the enforcement of IP rights and respect for human rights, as it is only capable of favouring one party over the other. A possible solution is not to rely on upload filters, but rather to change the way uploads are made. Users could attach labels to their content to show that it either falls under a copyright exception or has been licensed. These labels would notify rightsholders, who, with the help of OCSSPs, could take further action if the labels were abused.[57] Some may see this as contradictory to the purpose of Article 17 of the DSMD, which is to place primary liability on OCSSPs. However, the absence of active monitoring and the lower reliance on filter takedowns would reduce the impact on human rights. As such, this is submitted to be a practical solution based on the analysis above.
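As a purely illustrative aid, the labelling proposal above could be sketched along the following lines, assuming a hypothetical upload label carrying the user’s claimed legal basis and a notification hook for rightsholders; none of these structures exists in the Directive or on any current platform.

```python
# Hypothetical sketch of the labelling approach suggested above. The label
# fields, statuses and notification hook are assumptions for illustration only.
from dataclasses import dataclass
from typing import Callable, Literal

@dataclass
class UploadLabel:
    work_id: str                                      # the work the user says is involved
    basis: Literal["licence", "quotation", "parody", "none"]
    uploader_id: str

def handle_upload(label: UploadLabel,
                  notify_rightsholder: Callable[[str, UploadLabel], None]) -> str:
    """Publish labelled uploads immediately and notify the rightsholder,
    who may later challenge an abusive label with the OCSSP's help."""
    if label.basis == "none":
        return "held pending authorisation"           # no claimed basis: fall back to Art 17(4)
    notify_rightsholder(label.work_id, label)         # after-the-fact scrutiny instead of upload filtering
    return "published with label"

if __name__ == "__main__":
    log = []
    label = UploadLabel(work_id="Song A", basis="quotation", uploader_id="user_42")
    print(handle_upload(label, lambda work, lbl: log.append((work, lbl.basis))))
    print(log)  # [('Song A', 'quotation')]
```

The design point of the sketch is simply that scrutiny occurs after publication, on the rightsholder’s initiative, rather than through ex ante monitoring of every upload.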


Disclaimer: The opinions expressed in this article are those of the author and do not necessarily reflect the views of the University of Malaya Law Review, and the institution it is affiliated with.


Footnotes

[1] Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC [2019] OJ L 130/92.

[2] Senftleben, M. (2019). Bermuda Triangle - Licensing, Filtering and Privileging User-generated Content Under the New Directive on Copyright in the Digital Single Market. Vrije Universiteit Amsterdam. Retrieved from <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3367219>. Site accessed on 28 Dec 2019.

[3] Bridy, A. (2019). The Price of Closing the ‘Value Gap’: How the Music Industry Hacked EU Copyright Reform. Vanderbilt Journal of Entertainment & Technology Law, 22, 323. Retrieved from <https://papers.ssrn.com/sol3/papers.cfm?abstractid=3412249>. Site accessed on 28 Dec 2019.

[4] See footnote 1 above, art 17(3).

[5] See footnote 1 above, art 17(4)(a) & (b).

[6] See footnote 1 above, art 17(7).

[7] See footnote 1 above, art 2(6).

[8] See footnote 1 above, recital 62.

[9] Grisse, K. (2019). After the storm—examining the final version of Article 17 of the new Directive (EU) 2019/790. Journal of Intellectual Property Law & Practice, 14(11), 887, 888.

[10] Charter of Fundamental Rights of the European Union [2012] OJ C 326 E/391, art 16.

[11] See footnote 1 above, recital 63.

[12] See footnote 1 above, art 17(4)(b).

[13] See footnote 1 above, art 17(4)(c).

[14] See footnote 1 above, art 17(4)(b).

[15] Tuchtfeld, E. (2019, Mar 25). Filtering Fundamental Rights. The European Union’s balancing of intellectual property and the freedom to receive information. Volkerrechtsblog. Retrieved from <https://voelkerrechtsblog.org/filtering-fundamental-rights/>. Site accessed on 2 Jan 2020.

[16] See footnote 10 above.

[17] Productores de Música de España (Promusicae) v Telefónica de España SAU (C-275/06) [2008] ECR 54, 68.

[18] See footnote 10 above, art 7.

[19] See footnote 10 above, art 8.

[20] Romero-Moreno, F. (2019). ‘Notice and Staydown’ and Social Media: Amending Article 13 of the Proposed Directive on Copyright. International Review of Law, Computers & Technology, 33(2), 187, 189.

[21] Engeler, M., & Louven, S. (2019, Mar 23). Copyright Directive: Does the Best Effort Principle Comply With GDPR? Telemedicus. Retrieved from <https://www.telemedicus.info/copyright-directive-does-the-best-effort-principle-comply-with-gdpr/>. Site accessed on 2 Jan 2020.

[22] Google. (n.d.). How Content ID Works. Google Support. Retrieved from <https://support.google.com/youtube/answer/2797370?hl=en>. Site accessed on 3 Jan 2020.

[23] Belgische Vereniging van Auteurs Componisten en Uitgevers CVBA (SABAM) v Netlog NV (C-360/10) [2012] ECR 85, 38.

[24] See footnote 1 above, art 17(3) & (8).

[25] See footnote 1 above, art 17(7).

[26] See footnote 1 above, art 17(4).

[27] See footnote 10 above, art 17(2).

[28] See footnote 10 above, art 11.

[29] Ferrer, V. (2019). Right this Way: A Potential Artificial Intelligence-Based Solution for Complying with Article 13 of the EU’s 2018 Copyright Directive. Law School Student Scholarship, 948, 23. Retrieved from <https://scholarship.shu.edu/student_scholarship/948/>. Site accessed 2 Jan 2020.

[30] Audible Magic. (n.d.). Our Core Technology. Audible Magic. Retrieved from <https://www.audiblemagic.com/technology/>. Site accessed on 4 Jan 2020.

[31] See footnote 9 above, 894.

[32] Van der Sar, E. (2018, Sept 3). YouTube’s Content-ID Flags Music Prof’s Public Domain Beethoven and Wagner Uploads. TorrentFreak. Retrieved from <https://torrentfreak.com/youtube-targets-music-profs-public-domain-beethoven-and-wagner-uploads-180903/>. Site accessed on 4 Jan 2020.

[33] See footnote 10 above, art 11(1).

[34] See footnote 1 above, art 17(7).

[35] See footnote 1 above, art 17(9).

[36] See footnote 10 above, art 47.

[37] Electronic Frontier Foundation. (2019, Jan 8). The Mistake So Bad, That Even YouTube Says Its Copyright Bot ‘Really Blew It’. Electronic Frontier Foundation. Retrieved from <https://www.eff.org/takedowns/mistake-so-bad-even-youtube-says-its-copyright-bot-really-blew-it>. Site accessed on 4 Jan 2020.

[38] See footnote 1 above, art 17(9).

[39] See footnote 10 above, art 16.

[40] Corrigendum to Directive 2004/48/EC of the European Parliament and of the Council of 29 April 2004 on the enforcement of intellectual property rights [2004] OJ L 195/16, art 3(1) & (2).

[41] See footnote 23 above, 45-47.

[42] Owler. (n.d.). Audible Magic’s Competitors, Revenue and Employees. Owler. Retrieved from <https://www.owler.com/company/audiblemagic#overview>. Site accessed on 8 Jan 2020.

[43] Google. (2018, Nov). How Google Fights Piracy. Google, 27. Retrieved from <https://www.blog.google/documents/25/GO806_Google_FightsPiracy_eReader_final.pdf>. Site accessed on 8 Jan 2020.

[44] See footnote 23 above.

[45] See footnote 23 above, 26.

[46] See footnote 23 above, 51.

[47] See footnote 23 above, 48-49.

[48] Scarlet Extended SA v Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM) (C-70/10) [2011] ECR I-11959, 53.

[49] UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH, Wega Filmproduktionsgesellschaft mbH (C-314/12) [2014] ECR 689, 63.

[50] See footnote 49 above, 52.

[51] See footnote 49 above, 63.

[52] See footnote 49 above, 56.

[53] Tobias McFadden v Sony Music Entertainment Germany GmbH (C-484/14) [2016] ECR 689, 93.

[54] Angelopoulos, C. (2017, Jan). On Online Platforms and the Commission’s New Proposal for a Directive on Copyright in the Digital Single Market. Centre for Intellectual Property and Information Law. Retrieved from <https://juliareda.eu/wp-content/uploads/2017/03/angelopoulos_platforms_copyright_study.pdf>. Site accessed on 10 Jan 2020.

[55] See footnote 1 above, art 17(1).

[56] Republic of Poland v European Parliament and Council of the European Union (C-401/19) [2019] OJ C 270, 21.

[57] See footnote 9 above, 898.
