Obligations under the Digital Services Act (DSA) are now in force for the 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) designated by the European Commission.
Under the Digital Services Act (DSA), the European Commission (EC) designated 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs), i.e., services with an average of 45 million or more monthly active recipients in the EU. The platforms were designated by the EC based on the user data they were required to publish by February 17, 2023 (the full list of designated platforms is available here).
Designated companies had 4 months, expiring today, August 25, 2023, to comply with all the new obligations set by the DSA. These obligations are aimed at empowering and protecting online users, including minors, and at strengthening platform accountability, by requiring designated services to assess and mitigate their systemic risks and to have robust content moderation tools in place.
With regard to terms and conditions of service, providers of VLOPs and VLOSEs must now provide recipients with a concise, easily accessible, and machine-readable summary of the general terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language. The terms and conditions must be published in all the official languages of the member states in which they offer their services.
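The DSA does not prescribe a format for this machine-readable summary. Purely as an illustration, and with every field name below being an assumption rather than an official schema, such a summary could be published as structured data along the following lines:

```python
import json

# Hypothetical structure for a machine-readable terms-and-conditions summary.
# The DSA requires such a summary to exist but does not mandate a schema;
# every field name here is an illustrative assumption, not an official format.
terms_summary = {
    "service": "ExamplePlatform",
    "version": "2023-08-25",
    "languages_available": ["en", "fr", "de", "it", "es"],
    "summary": "Users may post content that complies with our community rules ...",
    "remedies": [
        {"type": "internal_complaint", "description": "Appeal a moderation decision"},
        {"type": "out_of_court_dispute_settlement", "description": "Refer the dispute to a certified body"},
        {"type": "judicial_redress", "description": "Bring the case before a national court"},
    ],
}

# Publishing the summary as JSON keeps it machine-readable and easy to localize.
print(json.dumps(terms_summary, indent=2, ensure_ascii=False))
```

Structured data of this kind would also make it straightforward to render the summary in each official language of the member states served.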
In terms of risk assessment, these providers must identify, analyze, and diligently assess any systemic risks in the EU arising from the design or operation of their service and its related systems, including algorithmic systems, or from the use of their services. Systemic risks to be mitigated include, for example, (i) the dissemination of illegal content, (ii) any actual or foreseeable adverse effects on the exercise of fundamental rights, including freedom of expression and information, media freedom and pluralism, and non-discrimination, and (iii) specific risks relating to gender-based violence, the protection of public health and of minors, and serious negative consequences for a person's physical and mental well-being.
In conducting these assessments, providers should also analyze whether and how the risks are affected by intentional manipulation of their service, including through inauthentic use or automated exploitation of the service, as well as by the potentially rapid and wide amplification and dissemination of illegal content and of information incompatible with their terms and conditions.
Where appropriate under the DSA, risk mitigation measures may include, among others, (i) the adjustment of content moderation processes, including the speed and quality with which notices concerning specific types of illegal content (e.g. illegal incitement to hatred, online violence) are processed and, where appropriate, the prompt removal of, or the disabling of access to, the notified content, as well as the adjustment of any relevant decision-making processes and of the resources dedicated to content moderation, (ii) the adoption of awareness-raising measures and the adaptation of their online interface in order to give service recipients more information, and (iii) the use of prominent markings to ensure that an item of information (e.g. image, audio, or video content) that has been generated or manipulated, that appreciably resembles existing persons, objects, places, or other entities or events, and that falsely appears to a person to be authentic or truthful, is distinguishable when presented on their online interfaces, together with a user-friendly feature that allows service recipients to flag such information.
With regard to minors, providers of online platforms accessible to minors (i) will be required to take appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors on their service, (ii) will no longer be allowed to present on their interface advertising based on the profiling of minors, (iii) will be required to conduct risk assessments specific to their services and proportionate to the systemic risks, including any adverse effects on the protection of children's health and any serious consequences for their physical and mental well-being, and (iv) will be required to take reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified, including, where appropriate, targeted measures to protect children's rights, such as age verification and parental control tools, or tools designed to help minors report abuse or obtain support.
Providers of VLOPs and VLOSEs will also be required to undergo, at their own expense and at least once a year, independent audits assessing their compliance with the DSA's duty-of-care obligations for a transparent and safe online environment, with their commitments under codes of conduct, including for online advertising, and with any crisis protocols they have developed. These audits include giving the auditing organizations access to all relevant data and premises and answering their questions.
In terms of increased powers granted to users, providers of VLOPs and VLOSEs that use recommender systems will need to offer, for each of those systems, at least one option that is not based on profiling. In addition, those that present advertisements on their online interfaces will be required to compile and make publicly accessible a repository containing at least the information specified in the DSA, including (i) the content of the advertisement, (ii) the person on whose behalf the advertisement is presented and, if different, the person who paid for it, (iii) an indication of whether the advertisement was intended to be presented to one or more specific target groups and, if so, the main parameters used for this purpose, including any parameters used to exclude one or more of those groups, and (iv) the commercial communications published on the platform. These providers will have to ensure that the repository does not contain personal data of the service recipients to whom the advertisement was or could have been presented.
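The DSA specifies the information the advertisement repository must contain but leaves its technical design to the provider. A minimal sketch of what a single repository entry could look like, using hypothetical field names and a Python data class purely for illustration:

```python
from dataclasses import dataclass, field, asdict
from typing import Optional

@dataclass
class AdRepositoryEntry:
    """Illustrative sketch of one entry in a public advertisement repository.

    Field names are hypothetical; the DSA lists the information to be disclosed
    but does not impose a schema. No personal data of the recipients to whom the
    advertisement was (or could have been) presented may be stored here.
    """
    ad_content: str                      # (i) the content of the advertisement
    presented_on_behalf_of: str          # (ii) the person on whose behalf it is presented
    paid_by: Optional[str] = None        # (ii) the payer, if different
    targeted: bool = False               # (iii) whether specific target groups were addressed
    targeting_parameters: list[str] = field(default_factory=list)  # (iii) main parameters used
    exclusion_parameters: list[str] = field(default_factory=list)  # (iii) parameters used to exclude groups

entry = AdRepositoryEntry(
    ad_content="Spring sale: 20% off running shoes",
    presented_on_behalf_of="Example Shoes Ltd.",
    paid_by="Example Media Agency",
    targeted=True,
    targeting_parameters=["interest: running", "age: 18-65"],
    exclusion_parameters=["age: under 18"],
)
print(asdict(entry))
```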
In addition, providers of VLOPs and VLOSEs will be required to provide the Digital Services Coordinator of the place of establishment or the EC, upon their reasoned request and within a reasonable period, with access to the data necessary to monitor and assess compliance with the DSA through appropriate interfaces (e.g., online databases or APIs). Upon the reasoned request of the coordinator, providers will also be required to give qualified researchers access to publicly accessible data for the purpose of conducting research that contributes to the detection, identification, and understanding of systemic risks in the EU, as well as to the assessment of the adequacy, efficiency, and impacts of risk mitigation measures (see here the EC's call for input on the DSA provisions regarding researchers' access to data).
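The DSA only requires that such access be granted through "appropriate interfaces" such as online databases or APIs, without defining a concrete API. A minimal sketch, assuming a hypothetical endpoint URL, query parameters, and authorization scheme, of what a query following a reasoned request might resemble:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical data-access endpoint; the DSA mandates "appropriate interfaces"
# (e.g. online databases or APIs) but does not define a concrete API.
BASE_URL = "https://platform.example/dsa/data-access/v1/metrics"

params = {
    "metric": "illegal_content_notices",  # illustrative metric name
    "from": "2023-07-01",
    "to": "2023-07-31",
    "member_state": "IT",
}

request = urllib.request.Request(
    BASE_URL + "?" + urllib.parse.urlencode(params),
    # Token issuance after a reasoned request is an assumption for this sketch.
    headers={"Authorization": "Bearer <access-token>"},
)

# The endpoint is fictional, so this call is purely illustrative.
with urllib.request.urlopen(request) as response:
    print(json.dumps(json.load(response), indent=2))
```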
Furthermore, providers will need to establish a compliance function that is independent of their operational functions and composed of one or more compliance officers, including a head of the compliance function. This function must have sufficient authority, status, and resources, as well as access to the provider's management body, in order to monitor the provider's compliance with the DSA.
Compliance with the DSA will be ensured by a pan-European supervisory architecture. The EC is the competent authority for supervising the designated platforms and search engines, and it will cooperate with the Digital Services Coordinators within the supervisory framework established by the DSA. These national authorities, which are also responsible for supervising smaller platforms and search engines, must be established by EU member states by February 17, 2024. By the same date, all other platforms will have to comply with their obligations under the DSA and provide their users with the protections and safeguards it sets out.
To ensure compliance with the DSA, the EC is also strengthening its internal and external multidisciplinary expertise and recently established the European Centre for Algorithmic Transparency (ECAT), which will assist it by assessing whether the functioning of algorithmic systems is in line with the risk management obligations. The EC is also building up a digital enforcement ecosystem that brings together expertise from all relevant sectors.
On a similar topic, you may find the following article interesting: "The Digital Market Act is in place and wants to change the Internet".