The Digital Services Act (DSA) introduced several important innovations in the regulation of online platforms such as Facebook and YouTube. Since the DSA has already been reviewed in various fora, including our blog, we would rather draw attention here to one of the painful shortcomings of the regulation.
During the trilogue, there was a lively debate about how the EU should deal with online platforms on which user-generated pornographic content can be published. Under the original regulatory concept, the provider of an online platform whose primary purpose is the sharing of pornographic footage would have been required to give its users enhanced protection through various technical and organizational measures.
As a primary technical measure, service providers would have had to require content producers to identify themselves by providing an e-mail address and telephone number. Registrants under the age limit could thus be rejected, and perpetrators could be identified easily in the event of a violation of the law. As an organizational measure, service providers would have had to employ human moderators qualified to identify image-based sexual abuse, including illegal sexual content. In addition to the complaint handling procedure applicable to all platform providers, a qualified complaint handling procedure would also have had to be established, enabling the rapid identification and removal of sexual recordings published without consent.
However, the article in question was deleted from the final text of the DSA for reasons that remain unclear. Yet there is a great need for more differentiated platform regulation than at present, as well as for more detailed or stricter rules for platforms on which content that may be unlawful and/or harmful to children is most likely to appear.
Experience shows that platforms hosting pornographic content are precisely such platforms. The number of complaints about revenge porn doubled in the United Kingdom in 2020, while in a survey covering 180 countries, 52% of young girls and women reported that they had already been abused in some form online. Most of the reports mentioned the publication of private images and information without consent, sexual harassment, and threats.
In the case of one of the largest porn sharing sites, the scandals over illegal material found on the service seem never to end, the material in question being mainly child pornography, recordings depicting real sexual violence, and sexual recordings published without consent. In 2020, the New York Times published a lengthy article reporting that Pornhub, a platform to which users can also upload content, hosted countless recordings depicting minors. Following the article, two major credit card companies, Visa and Mastercard, suspended their payment services on the platform.
Although the platform provider promised to pay more attention to content filtering, it did not take long for the next scandal to break. In the summer of 2022, the New Yorker published an article describing how hard it is for victims to have the recordings depicting them deleted. The stories recounted in the article suggest that the platform provider all but drags victims through the mud: they often need to retain a lawyer to justify, in a manner acceptable to the service provider, why they request the removal of footage in which they appear. And the recordings in question often reappear soon after removal.
As of today, porn sharing providers only have to comply with the general rules applicable to all online platforms. Under Article 7 DSA, such platform providers are not subject to any general monitoring obligation, i.e. they are not required to automatically filter either child pornography or re-uploaded content that has already been classified as unlawful. Since the DSA is a regulation implementing full harmonization, Member States will have no opportunity to lay down more detailed or stricter rules for particular types of services.
Although the recently adopted DSA leaves countless problems unsolved, it does contain some progressive provisions. For example, service providers will be required to designate contact points through which Member State authorities can communicate with them (Article 10). Service providers will also be required to include clear provisions on complaint handling in their terms and conditions (Article 12) and to publish annual transparency reports featuring statistics on content moderation (Article 13). Hosting providers, including video sharing providers, must establish notice and action mechanisms that facilitate the effective investigation of individual complaints, and providers will be obliged to notify the law enforcement authorities of their place of establishment of any illegal content they identify (Article 15a).
It remains to be seen whether these rules will be enforced effectively, as the DSA leaves enforcement to the Member States. This means that Member States will monitor whether the platforms under their jurisdiction comply with the provisions of the DSA, and they will also establish the sanctions to be applied in case of violations; the maximum fine may not exceed 6% of the annual turnover of the service provider concerned.