Reporting cannot exist without prior awareness. The risk we explore here is that while Trusted Flaggers enjoy facilitated reporting procedures, no facilitated procedures exist on the other side for monitoring and identifying reportable situations: a critical gap for enforcement.
The Digital Services Act (DSA) has been operational since August 25, 2023: from that date, designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must comply with the provisions of EU Regulation 2022/2065 on a single market for digital services. Looking ahead, from February 17, 2024, the DSA will become fully applicable to all other platforms (1) it addresses (2). This significant step obligates, as of now, 17 VLOPs (3) and 2 VLOSEs (4) to adhere to the new rules that bring European values into the digital world, in the words announced by the President of the European Commission, Ursula von der Leyen, on that same day, August 25, 2023.
Based on the intermediary's classification by user volume and on its role, the regulation envisages a graduated series of obligations and duties. To name a few, these include transparency reporting, procedures for verifying and identifying business users (know your business customer, KYBC), the obligation to notify the judicial authority in case of offenses, the mandatory establishment of facilitated procedures for receiving reports from Trusted Flaggers, and, finally, the implementation of restrictive and penalizing measures against repeat offenders.
Among these obligations, two are essential for online enforcement: any internet intermediary, even one based outside EU territory, must provide the European Commission with contact points and the details of its legal representatives, and it must cooperate with national authorities. This can hopefully help overcome the obstacles that investigative activities face today whenever they require the collaboration of internet services spread across various territories and exploited for illicit activities.
In this article, we will pay closer attention to Trusted Flaggers, the envisaged facilitations, and the limitations as perceived by the authors.
On September 15, 2023, the text of the so-called Caivano Decree (decreto-legge, a law decree) was published in Official Gazette no. 216 and later transmitted to the Senate to begin the legislative conversion process, under the number AS 878. The text is currently awaiting committee assignment. An article inserted in the text (5) designates AGCOM as the DSA coordinator for Italy.
But who exactly are Trusted Flaggers?
The term “Trusted Flagger” refers to an individual or entity that an online service provider regards as having particular expertise in reporting. Internet platforms, and internet service intermediaries in general, have involved many of these reliable reporters to counter hate speech and harmful content on their services, prioritizing their notifications over ordinary user reports.
However, the new law on digital services has transformed the role of Trusted Flaggers within the EU.
Trusted Flagger status will be awarded by the designated authority in the relevant EU member state; entities wishing to act as Trusted Flaggers, including online service providers, must apply to obtain the status. The conditions to be met are:
• Proven experience and expertise in detecting, identifying, and reporting illegal or hate-inciting content.
• Representation of collective interests and independence from online platforms.
• Conducting their activity with the aim of submitting reports promptly, diligently, and objectively.
The responsibilities and obligations envisaged for Trusted Flaggers are divided into:
• Transparency reports:
1. Notices categorized by the identity of the hosting provider.
2. Type of reported content.
3. Specific legal norms presumably violated by the reported content.
4. Action taken by the provider.
5. Any potential conflicts of interest and sources of funding.
6. An explanation of the procedures in place to ensure that the Trusted Flagger maintains their independence.
7. Trusted Flaggers must also publish annual reports on transparency and their funding structure, including sources and amounts of income.
• Notifications and reporting:
1. Standardized electronic submission of violation notices (a minimal sketch of such a notice follows this list).
2. The possibility of issuing correction notices for incorrect removals, restrictions, blocked content, and suspensions or closures of accounts.
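As an illustration of what a standardized electronic notice might carry, the following is a minimal sketch in Python. The field names and the submission endpoint are hypothetical assumptions; the elements themselves (a substantiated explanation, the exact location of the content, the notifier's identity, and a good-faith statement) track the notice requirements of the DSA's notice-and-action mechanism.

```python
# Minimal sketch of a standardized electronic violation notice.
# The endpoint and field names are hypothetical assumptions; the required
# elements (substantiated explanation, exact URL, notifier identity,
# good-faith statement) track the DSA's notice-and-action mechanism.
import json
import urllib.request
from dataclasses import dataclass, asdict


@dataclass
class ViolationNotice:
    explanation: str      # substantiated explanation of why the content is illegal
    content_url: str      # exact electronic location of the reported content
    notifier_name: str    # identity of the Trusted Flagger
    notifier_email: str
    legal_basis: str      # legal provision presumably violated
    good_faith: bool      # statement that the notice is accurate and complete


def submit_notice(endpoint: str, notice: ViolationNotice) -> int:
    """POST the notice to a platform's (hypothetical) reporting API."""
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(asdict(notice)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 201 if the notice was accepted
```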
The Digital Services Coordinator of a member state (in Italy, as mentioned above, AGCOM) can renew Trusted Flagger status as long as its holder continues to meet the regulation's requirements. Online platforms, in turn, can “flag” reliable reporters that submit a significant number of insufficient or inadequate notifications, which can trigger an investigation by the DSA Coordinator. Trusted Flagger status may also be suspended and eventually revoked if the holder is found to no longer meet the required conditions.
The new regulation thus imposes new responsibilities on these entities: organizations wishing to obtain Trusted Flagger status will need to improve their existing tools and processes to meet the new transparency and verifiability obligations. These tools and processes should ensure that their notifications are sufficiently and adequately detailed to avoid suspension of the status, and should integrate the tracking mechanisms needed in case of an investigation by the designated DSA Coordinator.
In conclusion, Article 19 (6) of the DSA (Article 22 in the final numbering of Regulation 2022/2065) governs the role of Trusted Flaggers, while Article 14 (7) (Article 16 in the final numbering) defines the notification mechanism and the conditions a proper report must meet. The obligation to establish facilitated procedures for receiving violation notifications, and to handle them rapidly, is well defined for the internet intermediary. In our opinion, however, equally useful and essential provisions are missing, especially from the Trusted Flagger's standpoint: an obligation on content platforms, at least on VLOPs and VLOSEs, to provide facilitated tools and procedures for global monitoring and content search (without geographical or public-visibility restrictions). From the enforcement perspective, the DSA therefore only encourages automated processes for communication with Trusted Flaggers, i.e., for receiving and managing abuse reports (e.g., via APIs), but imposes no equally binding obligation to facilitate the automation of monitoring and searching for violations by Trusted Flaggers.
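To make the asymmetry concrete, here is a purely hypothetical sketch of the kind of facilitated, restriction-free search interface the regulation does not mandate. No such API exists or is required today; the endpoint, the bearer-token accreditation, and the parameters are our own illustrative assumptions.

```python
# Purely hypothetical sketch of the facilitated, restriction-free search
# interface that the DSA does not mandate. The endpoint, the bearer-token
# accreditation, and the parameters are illustrative assumptions only.
import json
import urllib.parse
import urllib.request


def search_content(api_base: str, api_token: str, query: str,
                   country: str | None = None) -> list[dict]:
    """Query a platform's content index as an accredited Trusted Flagger.

    country=None would mean a truly global search, free of the geographical
    and public-visibility restrictions that ordinary access is subject to.
    """
    params = {"q": query}
    if country is not None:
        params["country"] = country
    url = f"{api_base}/trusted-flagger/search?{urllib.parse.urlencode(params)}"
    request = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_token}"}
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["results"]
```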
Over time, this gap risks making the fight against large-scale violations ineffective, since content platforms implement security and control measures to protect “their” data. Countermeasures and technologies (anti-bot techniques) can detect massive monitoring activity (in more technical terms, scraping) and limit it, even to the point of blocking access.
These countermeasures are currently the main obstacle that a right-holder, or a company specialized in internet monitoring, faces at the technical and practical level of the operational process: its access to the content platform is not distinguished from anyone else's, so its subsequent reporting activity is significantly limited.
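A minimal sketch of what this limitation looks like in practice, assuming a hypothetical monitoring target: once a platform's rate limiter starts answering with HTTP 429, a monitoring tool can only back off, and its throughput collapses.

```python
# Minimal sketch of large-scale monitoring colliding with anti-bot
# countermeasures. The URL is hypothetical; the pattern (HTTP 429 plus
# exponential backoff) shows how rate limiting throttles scraping-based
# monitoring, down to an outright failure to observe the content.
import time
import urllib.error
import urllib.request


def fetch_listing(url: str, max_retries: int = 5) -> str | None:
    """Fetch one page, backing off whenever the platform throttles us."""
    delay = 1.0
    for _ in range(max_retries):
        try:
            with urllib.request.urlopen(url) as response:
                return response.read().decode("utf-8", errors="replace")
        except urllib.error.HTTPError as error:
            if error.code == 429:   # the platform's rate limiter kicked in
                time.sleep(delay)
                delay *= 2          # each retry waits longer: throughput collapses
            else:
                raise
    return None  # effectively blocked: this item cannot be monitored
```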
The institution of Trusted Flaggers may well accelerate the removal of illegal content and make it more reliable. In practice, it remains to be seen whether it will also diminish the weight of removal notices from entities that are not Trusted Flaggers.
In the coming months, the Commission is expected to adopt a series of delegated acts designed to support the implementation of the new law, provide greater clarity on platform obligations, and increase the legal certainty that guides compliance plans.
So far, none of the VLOPs and VLOSEs has publicly refused to comply with the DSA. However, the retail platforms Amazon and Zalando have both contested their inclusion in the list.
Zalando stated that “the European Commission did not take into account the predominantly retail nature of its business model and that it does not present a ‘systemic risk’ of spreading harmful or illegal content by third parties.” An Amazon spokesperson told The Verge that the company “does not fall under the description of ‘Very Large Online Platform’ under the DSA” and was “unfairly singled out.”
Amazon has nonetheless added a function that allows users to report illegal products sold on its platform, despite claiming that the DSA's “burdensome administrative obligations” do not benefit its European consumers.
Meta also allows Instagram and Facebook users to challenge moderation decisions on their content, while TikTok users can choose not to receive personalized recommendations. Google, Snap, and Meta have already submitted a list of measures taken to meet the requirements of the DSA.
Will it be enough to satisfy regulatory authorities? Only time will tell; regardless, it is unlikely that these VLOPs and VLOSEs can afford to relax and let things run their course.
******
Note:
(1) The DSA defines “online platforms,” “hosting services,” and, more generally, “all intermediaries.”
(2) For further details on the DSA regulations, refer to the whitepaper available at the following link: Digital Services Act Whitepaper.
(3) Notable platforms falling under this category include AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, X, Wikipedia, YouTube, and Zalando.
(4) Bing and Google Search.
(5) Article 15, addressing the appointment of the coordinator of digital services in implementation of EU Regulation 2022/2065 on digital services, outlines the following:
Paragraph 1 designates AGCOM (Autorità per le Garanzie nelle Comunicazioni) as the Coordinator of Digital Services to ensure the effectiveness of the obligations and rights established by EU Regulation 2022/2065 concerning a single market for digital services. This designation also includes supervision regarding the protection of minors.
Paragraph 2 stipulates that AGCM (Autorità Garante della Concorrenza e del Mercato), the Privacy Guarantor, and other relevant authorities ensure their collaboration with AGCOM.
Paragraph 3 refers to a provision by AGCOM for defining the conditions, procedures, and operational modalities for the exercise of its functions.
Paragraph 4 introduces coordination amendments to Article 1 of Law No. 249/1997, listing the tasks of AGCOM as the Coordinator of Digital Services. Notably, in case of violations of the obligations outlined in Articles 9, 14, 15, 23, 24, 26, 27, 28, 30, 45, 46, 47, and 48 of EU Regulation 2022/2065, AGCOM can impose administrative pecuniary sanctions of up to 6% of the provider's worldwide annual turnover in the financial year preceding the commencement of the procedure. This applies to providers of intermediary services falling within its sphere of competence, also in its capacity as Coordinator of Digital Services, in accordance with the national and European law applicable to the offense.
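For illustration: a provider with a worldwide annual turnover of €10 billion in the preceding financial year could therefore face a sanction of up to €600 million (6% of €10 billion).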
Paragraphs 5 and 6 provide provisions regarding the increase in AGCOM personnel.
(6) https://digitalservicesact.cc/dsa/art19.html
(7) https://digitalservicesact.cc/dsa/art14.html