Prepare for the DSA: Far-Reaching Obligations for Digital Services

Published on 6 September 2024

Since 17 February 2024, the Digital Services Act ("DSA") has applied in full: a significant piece of European legislation that profoundly reshapes the rules governing online services. The DSA focuses on so-called intermediary services, which transmit, store, and/or disseminate information on behalf of their users, such as platforms where content is shared and marketplaces for goods and services.

The DSA aims to create a safer and more trustworthy online environment by tackling the spread of illegal content and disinformation and by better protecting the fundamental rights of users of intermediary services. In addition to imposing obligations on intermediary services, such as duties of care regarding potentially illegal content, the DSA also regulates the liability of these services for illegal content posted by their users. Although liability is an important part of the legislation, this blog focuses primarily on the duties of care the DSA imposes on intermediary services.


When Is an Intermediary Service an Online Platform?

The DSA applies to various types of intermediary services. An intermediary service is a digital service that plays a role in transmitting, storing, or disseminating information on behalf of its users. These services act as intermediaries in the digital world, facilitating the exchange of user information without creating it themselves.

Within the category of intermediary services, there are different types of services, including hosting services and online platforms:

  • Hosting services are services focused on storing information at the request of the user. A broad spectrum of online services falls under this category, such as cloud services, web hosting, traffic generation services, and email services. These services ensure that information is stored and remains accessible to the user, but they do not further disseminate this information to a large audience.
  • Online platforms, on the other hand, go a step further. In addition to storing information, they also make it accessible to a large audience. This means that the information a user posts on the platform becomes available to many people without the user having to take additional actions. Examples of online platforms include social networks like Facebook and Instagram, where users share posts and photos that can be viewed by many, and online marketplaces like eBay and Amazon, where products are offered to a large number of consumers.

The obligations for online platforms under the DSA do not apply to micro and small companies. A company qualifies as micro or small if it has fewer than 50 employees and an annual turnover or annual balance sheet total not exceeding 10 million euros. This exception is intended to spare smaller businesses the significant burden of complying with the DSA's obligations. Should a company grow and no longer meet these criteria, it has twelve months from the loss of that status to comply with the obligations applicable to online platform providers.
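
To make the size test concrete, it can be reduced to a simple threshold check. The Python sketch below is our own simplification: the function name and parameters are illustrative, and the actual assessment under EU Recommendation 2003/361 involves further nuances (for example, partner and linked enterprises).

```python
def qualifies_as_micro_or_small(employees: int,
                                annual_turnover_eur: float,
                                balance_sheet_total_eur: float) -> bool:
    """Simplified check against the size criteria described above:
    fewer than 50 employees and a turnover or balance sheet total
    not exceeding EUR 10 million. Illustrative only."""
    under_headcount = employees < 50
    under_financials = (annual_turnover_eur <= 10_000_000
                        or balance_sheet_total_eur <= 10_000_000)
    return under_headcount and under_financials
```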

The DSA also recognizes the existence of Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLOSEs"). These specific categories of online platforms are designated as such by the European Commission when a service has at least 45 million average monthly active users within the EU, calculated over a six-month period. Examples include well-known social media like X and Instagram, major online marketplaces like Amazon and Booking.com, and search engines like Google Search. Due to their influential role within the EU, VLOPs and VLOSEs have additional obligations compared to 'normal' online platforms.
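
Purely as an illustration, the numerical side of the designation test boils down to an average over six monthly figures; the actual designation is a formal decision by the European Commission, and how an 'active user' is counted is itself regulated. The function below is our own sketch under those assumptions.

```python
VLOP_THRESHOLD = 45_000_000  # 45 million average monthly active users in the EU

def exceeds_vlop_threshold(monthly_active_users_eu: list[int]) -> bool:
    """Check six monthly EU active-user figures against the designation
    threshold. Assumes the figures have already been established in
    accordance with the DSA's counting rules."""
    if len(monthly_active_users_eu) != 6:
        raise ValueError("expected six monthly figures")
    return sum(monthly_active_users_eu) / 6 >= VLOP_THRESHOLD
```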

Obligations Regarding Tackling Illegal Content

Providers of online platforms face various responsibilities under the DSA to combat illegal content, as they may knowingly or unknowingly play a role in its dissemination. Although the DSA does not impose a general obligation to constantly monitor all content, it does create a system of due diligence obligations designed to prevent the spread of illegal content and to ensure that such content is promptly made inaccessible:

  • Providers of all types of intermediary services must include clear information in their terms and conditions about the restrictions they impose on content provided by users, including details about policies, procedures, measures, and tools used for content moderation.
  • Additionally, providers of hosting services, as a specific category of intermediary services, must offer a user-friendly way for individuals or organizations to report illegal content, decide on these reports in a careful, timely, non-arbitrary, and objective manner, and clearly explain the decisions taken and the available appeal options (a sketch of such a notice follows this list).
  • Providers of online platforms have additional obligations: they must suspend users who repeatedly post clearly illegal content, as well as users who frequently file unfounded complaints, and prioritize and promptly process and handle reports from trusted flaggers (certified organizations that identify illegal online content).
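
To give a sense of what such a reporting mechanism handles, the sketch below models a single notice as a data record. This is our own illustration, not a prescribed format; the field names are hypothetical, but the elements reflect what the DSA expects a notice of illegal content to contain.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IllegalContentNotice:
    """Hypothetical shape of a notice submitted through a hosting
    provider's reporting mechanism. Field names are ours."""
    explanation: str               # why the notifier considers the content illegal
    content_url: str               # exact electronic location of the content
    notifier_name: Optional[str]   # identity may be withheld for certain offences
    notifier_email: Optional[str]
    good_faith_statement: bool     # confirmation that the notice is accurate and complete
```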

Obligations Regarding Users’ Rights

It is permissible for a provider of an intermediary service to impose restrictions on users, such as suspending a user’s account when they have posted illegal content or violated the terms and conditions. As such decisions can have a significant impact on the user, the DSA provides users with the opportunity to object, for example, by filing a complaint. To ensure this process is fair and transparent, the DSA imposes several obligations on providers of intermediary services:

  • Firstly, providers of intermediary services must clearly explain in their terms and conditions how their internal complaint handling system works. All types of intermediary services are also required to produce an annual transparency report. This report must include the measures they have taken to combat illegal content, how reports of illegal content have been handled, and what usage restrictions have been imposed. However, the previously discussed micro and small enterprises are exempt from the obligation to produce transparency reports.
  • Providers of hosting services must also notify individuals or organizations who report illegal content of the decisions made in response to these reports and inform users when their use of the service is restricted, including an explanation of the reason.
  • For providers of online platforms, additional obligations apply. They must record their decisions to impose restrictions on users, including the reasons for those decisions, in a database maintained by the European Commission. Additionally, they must provide users with access to an effective internal complaint handling system, where complaints can be submitted electronically and free of charge. Users must also be informed of the possibility of having disputes resolved by a certified out-of-court dispute settlement body. Finally, online platforms must include information in their terms and conditions about the main parameters used in their recommendation systems and how users can adjust these.
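
As an illustration of what such a recorded decision looks like, the sketch below models it as a data record. The field names and types are ours, not the schema of the Commission's database; the elements are loosely modelled on what a statement of reasons must contain under the DSA.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StatementOfReasons:
    """Hypothetical record of a restriction decision. Illustrative only."""
    restriction: str                   # e.g. "removal", "visibility restriction", "suspension"
    facts_and_circumstances: str       # what the decision is based on
    automated_means_used: bool         # whether automated detection or decision-making was involved
    legal_ground: Optional[str]        # cited where the content is considered illegal
    contractual_ground: Optional[str]  # cited where the content violates the terms and conditions
    redress_options: list = field(default_factory=lambda: [
        "internal complaint handling",
        "out-of-court dispute settlement",
        "judicial redress",
    ])
```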

Obligations for Business-to-Consumer (B2C) Online Marketplaces

B2C online marketplaces are subject to additional requirements. These measures are intended to protect consumers and, at the same time, to discourage traders from offering products or services that violate applicable laws and regulations. Providers of B2C online marketplaces are therefore required to ensure the traceability of sellers using their platform. This means they must implement measures to identify and verify the identity of sellers, making every reasonable effort to assess the reliability of the information provided. Additionally, they must design and organize their online interface in such a way that sellers can comply with consumer law.
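
The sketch below illustrates the kind of seller information a marketplace would collect before onboarding. The field names and the completeness check are our own assumptions; the actual traceability obligation also requires best efforts to verify the information, for example against freely accessible official databases.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SellerDetails:
    """Hypothetical record of the information a B2C marketplace collects
    from a seller before allowing them to trade. Illustrative only."""
    name: str
    address: str
    phone: str
    email: str
    id_document_ref: str                  # reference to a copy of an identification document
    payment_account: str
    trade_register_number: Optional[str]  # where applicable
    self_certification: bool              # commitment to offer only compliant products/services

def looks_complete(details: SellerDetails) -> bool:
    """Minimal completeness check; a real implementation would also
    verify the reliability of the information provided."""
    required = [details.name, details.address, details.phone, details.email,
                details.id_document_ref, details.payment_account]
    return all(required) and details.self_certification
```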

Obligations Regarding the Protection of Minors

In addition to the general due diligence obligations, the DSA imposes specific obligations aimed at protecting minor users. Providers of intermediary services must explain the terms of use of their service, and the restrictions that apply to it, in a way that is understandable to minors, especially if the service is primarily aimed at or used by them. Furthermore, providers of online platforms must take measures to ensure a high level of privacy, safety, and security for minors when their platform is accessible to them. Moreover, they must not display advertisements based on profiling when they are aware with reasonable certainty that the user is a minor.

Conclusion

The DSA introduces many obligations for providers of intermediary services, including online platforms. Complying with these rules requires significant effort, from tackling illegal content to protecting minors and safeguarding consumer rights. It is therefore essential to carefully evaluate whether your service falls under the DSA and which specific obligations apply.
