Digital Services Act (DSA) comes into effect for all intermediary service providers

19/02/24

February 17, 2024, marks the date on which the DSA became fully applicable to all digital market players, such as social media, content-sharing and e-commerce platforms, as well as hosting service providers.


While the regulation had already applied to very large online platforms (VLOPs) and very large online search engines (VLOSEs) since late August 2023, it now applies to all providers offering intermediary services to recipients within the European Union, regardless of where those providers are established. This marks a significant milestone for the management of illegal content, risks, and transparency obligations across the sector, the implementation of which will be supervised by the national Digital Services Coordinators (“DSCs”) and the European Board for Digital Services (the “Board”).


Designation of Digital Services Coordinators


Member States were also required, by February 17, 2024, to designate a competent authority as DSC, in accordance with Article 49 of the DSA. The primary role of each coordinator is to monitor and ensure compliance with the regulation at the national level and to foster coordination among the competent authorities concerned. DSCs also serve as a point of contact for service recipients seeking to lodge complaints.

These Digital Services Coordinators collectively form the European Board for Digital Services, which is tasked with monitoring, coordination, and the development of European standards for the implementation of the DSA, in accordance with Articles 61 to 63 of the DSA.

France stands among the first four Member States to have signed administrative arrangements with the European Commission aimed at bolstering efficient coordination under the DSA. Its Secure and Regulate the Digital Space Bill designates the Regulatory Authority for Audiovisual and Digital Communication (ARCOM) as the Digital Services Coordinator in France. To date, the following Member States have officially designated their Digital Services Coordinator:


In light of the European Commission’s Recommendation of October 20, 2023 on coordinating responses to the spread and amplification of illegal content, an Informal Network was formed, comprising both designated and prospective Digital Services Coordinators. This network, which convened several times between late 2023 and early 2024, played a pivotal role in laying the groundwork for the DSA’s effective implementation.

The Board will now take over the responsibilities previously handled by this Informal Network, including issuing guidelines and recommendations and coordinating joint investigations, thereby promoting consistent application of the regulation and harmonized regulatory oversight across national boundaries.


Highlight on France: Aligning national legislation with the DSA


France is currently refining its legislative framework through the Secure and Regulate the Digital Space Bill (the “Bill”), with the objective of adapting national law to the DSA. The Bill is presently under review by the Joint Parliamentary Committee.

On October 25, 2023, the European Commission issued a detailed opinion raising concerns about several provisions of the Bill as passed by the National Assembly, notably those relating to the protection of minors and the regulatory powers granted to French authorities, such as the power to introduce bans and to require pre-access warning messages for certain illicit content. The Commission highlighted that such provisions challenge the “country of origin” principle and encroach upon its exclusive authority to oversee VLOPs and VLOSEs.


France is consequently required to take the European Commission’s observations into account, prompting parliamentarians to amend the text to ensure its compliance and thereby delaying its enactment.


Enhanced oversight: Rigorous compliance monitoring by the European Commission


The enforcement of the DSA since it became applicable to VLOPs and VLOSEs has underscored the European Commission’s steadfast commitment to ensuring compliance with the regulation. Since October 2023, this commitment has been evidenced by inquiries sent to seventeen VLOPs and VLOSEs and by the initiation of formal proceedings against X (formerly known as Twitter), scrutinizing their compliance with the obligations to mitigate the dissemination of illegal content, to counteract information manipulation, and to provide researchers with access to data.

As of February 17, the regulatory framework broadens, potentially ushering in a phase of investigative scrutiny for entities other than VLOPs and VLOSEs, in line with the responsibilities vested in the DSCs and the Board, which are charged with monitoring the application of, and compliance with, the DSA.

It is also worth noting that a dedicated working group was established to foster cooperation “to establish best practices and standards for age verification”. This initiative, which held its inaugural meeting in January 2024, underscores an intensified commitment to the DSA’s provisions on the protection of minors, an area of increased regulatory attention. This is further evidenced by the European Commission’s initiation of formal proceedings against TikTok today, the second formal proceedings under the DSA after those against X. TikTok could face fines of up to 6% of its global turnover if found to be in violation of the DSA.

Regarding the widely discussed topic of AI, the Commission clarified in an answer that “Where an AI system is embedded in a service of an online platform, it can be considered an algorithmic system within this service and as such will fall within the scope of the DSA.”
