Questions and Answers: Digital Services Act*

(Source: European Commission)


  1. General information on the Digital Services Act
  2. Impact of the Digital Services Act on users
  3. Impact of the Digital Services Act on businesses
  4. Impact of the Digital Services Act on Member States

1.  General information on the Digital Services Act

What is the Digital Services Act?

The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others.

It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.

The European Parliament and the Council reached a political agreement on the new rules on 23 April 2022. The co-legislators should formally approve the rules soon.

The Digital Services Act is a Regulation that will be directly applicable across the EU. Some of the obligations include:

  • Measures to counter illegal content online, including illegal goods and services. The DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove illegal content;
  • New rules to trace sellers on online marketplaces, to help build trust and go after scammers more easily; a new obligation on online marketplaces to randomly check against existing databases whether products or services on their sites are compliant; sustained efforts to enhance the traceability of products through advanced technological solutions;
  • Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions, based on new mandatory information provided to users when their content is removed or restricted;
  • Wide ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency on the algorithms used for recommending content or products to users;
  • New obligations for the protection of minors on any platform in the EU;
  • Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, with oversight through independent audits of their risk management measures. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, or harms to minors online. These measures must be carefully balanced against restrictions of freedom of expression;
  • A new crisis response mechanism for serious threats to public health and security, such as a pandemic or a war;
  • Bans on targeted advertising on online platforms that profiles children or uses special categories of personal data such as ethnicity, political views or sexual orientation. Enhanced transparency for all advertising on online platforms and for influencers’ commercial communications;
  • A ban on using so-called ‘dark patterns’ on the interface of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make;
  • New provisions to allow access to data to researchers of key platforms, in order to scrutinise how platforms work and how online risks evolve;
  • Users will have new rights, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for breaches of the rules. Representative organisations will also be able to defend user rights for large scale breaches of the law;
  • A unique oversight structure. The Commission will be the primary regulator for very large online platforms (reaching more than 45 million users), while other platforms will be under the supervision of the Member States where they are established. The Commission will have enforcement powers similar to those it has under anti-trust proceedings. An EU-wide cooperation mechanism will be established between national regulators and the Commission;
  • The liability rules for intermediaries have been reconfirmed and updated by the co-legislator, including a Europe-wide prohibition of generalised monitoring obligations.

Does the Digital Services Act define what is illegal online?

No. The DSA sets out EU-wide rules covering the detection, flagging and removal of illegal content, as well as a new risk assessment framework for very large online platforms and search engines on how illegal content spreads on their services.

What constitutes illegal content is defined in other laws, either at EU level or at national level – for example, terrorist content, child sexual abuse material and illegal hate speech are defined at EU level. Where content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.

Will the Digital Services Act replace sector specific legislation?

No. The Digital Services Act sets the horizontal rules covering all services and all types of illegal content, including goods or services. It does not replace or amend, but it complements sector-specific legislation such as the Audiovisual Media Services Directive (AVMSD), the Directive on Copyright in the Digital Single Market, the Consumer Protection Acquis, or the Proposal for a Regulation on preventing the dissemination of terrorist content online.

What are the current rules and why do they have to be updated?

The e-Commerce Directive, adopted in 2000, has been the main legal framework for the provision of digital services in the EU. It is a horizontal legal framework that has been the cornerstone for regulating digital services in the European single market.

Much has changed in 20 years and the rules need to be upgraded. Online platforms have created significant benefits for consumers and innovation, and have facilitated cross-border trading within and outside the Union and opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content, or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users’ rights, information flows and public participation. In addition, the e-Commerce Directive did not specify any cooperation mechanism between authorities. The so-called Country-of-Origin principle meant that the supervision was entrusted to the country of establishment.

The Digital Services Act builds on the rules of the e-Commerce Directive, and addresses the particular issues emerging around online intermediaries. Member States have regulated these services differently, creating barriers for smaller companies looking to expand and scale up across the EU and resulting in different levels of protection for European citizens.

With the Digital Services Act, unnecessary legal burdens due to different laws will be lifted, fostering a better environment for innovation, growth and competitiveness, and facilitating the scaling up of smaller platforms, SMEs and start-ups. At the same time, it will equally protect all users in the EU, both as regards their safety from illegal goods, content or services, and as regards their fundamental rights.

What is the relevance of the Regulation of intermediaries at global level?

The new rules are an important step in defending European values in the online space. They respect international human rights norms, and help better protect democracy, equality and the rule of law.

The DSA sets high standards for effective intervention, for due process and the protection of fundamental rights online; it preserves a balanced approach to the liability of intermediaries, and establishes effective measures for tackling illegal content and societal risks online. In doing so, the DSA aims at setting a benchmark for a regulatory approach to online intermediaries also at the global level.

Will these rules apply to companies outside of the EU?

They apply in the EU single market, without discrimination, including to those online intermediaries established outside of the European Union that offer their services in the single market. When not established in the EU, they will have to appoint a legal representative, as many companies already do as part of their obligations in other legal instruments. At the same time, online intermediaries will also benefit from the legal clarity of the liability exemptions and from a single set of rules when providing their services in the EU.

Does the Digital Services Act include provisions for digital taxation?

No, the Commission’s proposal for an interim digital tax on revenue from digital activities is a separate initiative from the Digital Services Act. There are no provisions in the Digital Services Act in the field of taxation.

2.  Impact on users

How will citizens benefit from the new rules?

Online platforms play an increasingly important role in the daily lives of Europeans. The rules will create a safer online experience for citizens to freely express their ideas, communicate and shop online, by reducing their exposure to illegal activities and dangerous goods and ensuring the protection of fundamental rights.

Online marketplaces will need to identify their business users and clarify who is selling a product or offering a service; this will help track down rogue traders and will protect online shoppers against illegal products, such as counterfeit and dangerous goods. When online marketplaces become aware that a product or service is illegal, they will be required to inform consumers who purchased it about a) the illegality, b) the identity of the trader and c) any relevant means of redress. They will randomly check the documentation of products sold on their platform, and should increasingly rely on enhanced product traceability solutions, to make sure fewer and fewer non-compliant goods reach European consumers.

At the same time, citizens will be able to notify illegal content, including products, that they encounter, and contest the decisions made by online platforms when their content is removed: platforms are obliged to notify users of any decision taken and of the reasons for it, and to provide a mechanism to contest the decision.

They will also receive more information about ads they are seeing on online platforms – for example, if and why an ad targets them specifically. Platforms will no longer serve behaviourally targeted ads for minors and will no longer present ads to their users based on profiling that rests on special categories of personal data, such as their ethnicity, political views or sexual orientation.

Specific rules will be introduced for very large online platforms and very large online search engines that reach more than 45 million users, given their systemic impact in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas. When such platforms recommend content, users will be able to modify the criteria used, and choose not to receive personalised recommendations. Citizens will not have to take these companies at their word; citizens will be able to scrutinise their actions through the reports of independent auditors and vetted researchers.

Users will be able to seek compensation from providers of intermediary services for any damage or loss suffered due to an infringement of the DSA by such provider.

What measures does the legislation take to counter illegal content?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal content, but also illegal goods and services.

Users will be empowered to report illegal content in an easy and effective way. A privileged channel will be created for trusted flaggers – entities which have demonstrated particular expertise and competence – to report illegal content to which platforms will have to react with priority. When enabled by national laws, Member State authorities will be able to order any platform operating in the EU, irrespective of where they are established, to remove illegal content.

Finally, very large online platforms will need to take mitigating measures at the level of the overall organisation of their service to protect their users from illegal content, goods and services.

How will the DSA protect people from unsafe or counterfeit goods?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal goods. Platforms will have mandatory procedures in place for removing illegal goods. Online marketplaces will also be required to trace their traders (“know your business customer”). This will ensure a safe, transparent and trustworthy environment for consumers and discourage traders who abuse platforms from selling unsafe or counterfeit goods. Online platforms will further be required to organise their online interfaces in a way that allows traders to comply with their information obligations towards consumers.

A new system of trusted flaggers will also be available, e.g. for brand owners fighting counterfeit goods, allowing faster and easier flagging and removal of counterfeit goods. Public authorities will have new tools to order the removal of unsafe products directly. Marketplaces will also be required to make reasonable efforts to randomly check whether products or services have been identified as illegal in any official database and to take appropriate action. Very large online platforms will be subject to an audited risk assessment that will include an analysis of their vulnerability to illegal goods on their platforms, and their mitigation measures at organisational level will also be subject to annual audits.

How will the DSA protect minors?

Under the new rules, providers of online platforms that are accessible to minors will be required to put in place appropriate measures to ensure a high level of privacy, safety and security of minors on their services.

In addition, the new rules will ban advertising targeted at minors based on profiling of their personal data, where platforms can establish with reasonable certainty that the recipient of the service is a minor.

How can harmful but not illegal content be effectively addressed?

To the extent that it is not illegal, harmful content should not be treated in the same way as illegal content. The new rules will only impose measures to remove or encourage removal of illegal content, in full respect of the freedom of expression.

The DSA regulates platforms’ responsibilities when it comes to systemic issues such as disinformation, hoaxes and manipulation during pandemics, harms to vulnerable groups and other emerging societal harms. Very large online platforms and very large online search engines that reach more than 45 million users will have to assess and mitigate societal risks stemming from the design and use of their services.

In addition, the proposal sets out a co-regulatory framework under which service providers can work within codes of conduct to address negative impacts of the viral spread of illegal content, as well as manipulative and abusive activities, which are particularly harmful for vulnerable recipients of the service, such as minors.

The DSA will foster a co-regulatory framework for online harms, including codes of conduct such as a revised Code of Practice on disinformation, and crisis protocols.

How will you keep a fair balance with fundamental rights such as the freedom of expression?

The text puts protection of freedom of expression at its very core. This includes protection from government interference in people’s freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and accompanied by robust safeguards for freedom of expression and an effective right of redress – to avoid both under-removal and over-removal of content on grounds of illegality.

The proposal gives users the possibility to contest the decisions taken by the online platforms to remove their content, including when these decisions are based on platforms’ terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body or seek redress before Courts.

The Digital Services Act proposes rules on transparency of content moderation decisions. Users and consumers will gain a better understanding of the ways very large platforms impact our societies, and those platforms will be obliged to mitigate the risks they pose, including as regards freedom of expression. They will be held accountable through independent auditing reports and specialised and public scrutiny.

All the obligations in the DSA, including the crisis response mechanism, are carefully calibrated to promote the respect of fundamental rights, such as freedom of expression.

How does the Digital Services Act tackle disinformation?

Through the proposed rules on how platforms moderate content, on advertising, algorithmic processes and risk mitigation, the DSA will aim to ensure that platforms – and in particular the very large ones – are more accountable and assume their responsibility for the actions they take and the systemic risks they pose, including on disinformation and manipulation of electoral processes.

The Digital Services Act will foster a co-regulatory framework, together with the updated Code of Practice on Disinformation and the new Commission Guidance, as announced in the European Democracy Action Plan.

How does the Digital Services Act regulate online advertising?

The Digital Services Act covers any type of advertising, from digital marketing to issue-based and political ads, and complements existing rules such as the General Data Protection Regulation, which already establishes, for example, rules on users’ consent and their right to object to targeted digital marketing.

The DSA introduces two new restrictions concerning targeted advertising on online platforms. First, it bans targeted advertising of minors based on profiling. Second, it bans targeted advertising based on profiling using special categories of personal data, such as sexual orientation or religious beliefs.

The new rules will empower users in understanding and making informed decisions about the ads they see. They will have to be clearly informed whether and why they are targeted by each ad and who paid for the ad; they should also see very clearly when content is sponsored or organically posted on a platform and should also see when influencers are promoting commercial messages. Notice and action obligations also apply for potentially illegal ads, as for any other type of content.

For very large online platforms, the societal stakes are higher, and the rules include additional measures to mitigate risks and enable oversight. They will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. They will also need to assess whether and how their advertising systems are manipulated or otherwise contribute to societal risks, and take measures to mitigate these risks.

The rules are complemented by measures in the Digital Markets Act, which tackles the economic concerns over gatekeepers’ advertising models.

How does the Digital Services Act protect personal data?

The DSA has been designed in full compliance with existing rules on data protection, including the General Data Protection Regulation (GDPR) and the ePrivacy Directive, and does not modify the rules and safeguards set out in these laws.

How does the Digital Services Act address dark patterns?

Under new rules, ‘dark patterns’ are prohibited. Providers of online platforms will be required not to design, organise or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of users of their services to make free and informed decisions.

The ban complements, but does not override, the prohibitions already established under consumer protection and data protection rules, under which a large number of dark patterns that mislead consumers are already banned in the EU.

3.  Impact on businesses

What digital services does the act cover?

The Digital Services Act applies to a wide range of online intermediaries, which include services such as internet service providers, cloud services, messaging, marketplaces, or social networks. Specific due diligence obligations apply to hosting services, and in particular to online platforms, such as social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. The most far-reaching rules in the Digital Services Act focus on very large online platforms, which have a significant societal and economic impact, reaching at least 45 million users in the EU (representing 10% of the population). Similarly, very large online search engines with more than 10% of the 450 million consumers in the EU will bear more responsibility in curbing illegal content online.

What impact will the Digital Services Act have on businesses?

The DSA modernises and clarifies rules dating back to the year 2000. It will set a global benchmark, under which online businesses will benefit from a modern, clear and transparent framework assuring that rights are respected and obligations are enforced.

Moreover, for online intermediaries, and in particular for hosting services and online platforms, the new rules will cut the costs of complying with 27 different regimes in the single market. This will be particularly important for innovative SMEs, start-ups and scale-ups, which will be able to scale at home and compete with very large players. Small and micro-enterprises will be exempted from some of the rules that might be more burdensome for them, and the Commission will carefully monitor the effects of the new Regulation on SMEs.

Other businesses will also benefit from the new set of rules. They will have access to simple and effective tools for flagging illegal activities that damage their trade, as well as internal and external redress mechanisms, affording them better protections against erroneous removal, limiting losses for legitimate businesses and entrepreneurs.

Furthermore, providers that voluntarily take measures to further curb the dissemination of illegal content will be reassured that doing so does not deprive them of their protection from legal liability.

What impact will the Digital Services Act have on start-ups and innovation in general?

It will make the single market easier to navigate, lower compliance costs and establish a level playing field. Fragmentation of the single market disproportionately disadvantages SMEs and start-ups wishing to grow, due to the absence of a large enough domestic market and the costs of complying with many different sets of legislation. Those costs of fragmentation are much easier to bear for businesses that are already large.

A common, horizontal, harmonised rulebook applicable throughout the Digital Single Market will give SMEs, smaller platforms and start-ups, access to cross-border customers in their critical growth phase. The rules are accompanied by standardisation actions and Codes of Conduct that should support a smooth implementation by smaller companies.

How will the proposed Digital Services Act differentiate between small and big players?

The proposal sets asymmetric due diligence obligations on different types of intermediaries depending on the nature of their services as well as on their size and impact, to ensure that their services are not misused for illegal activities and that providers operate responsibly. Certain substantive obligations are limited only to very large online platforms, which have a central role in facilitating the public debate and economic transactions. Very small platforms are exempt from the majority of obligations.

By rebalancing responsibilities in the online ecosystem according to the size of the players, the proposal ensures that the regulatory costs of these new rules are proportionate.

What impacts will the proposed Digital Services Act have on platforms and very large platforms?

All platforms, except the smallest, will be required to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms, cooperate with trusted flaggers, take measures against abusive notices, deal with complaints, vet the credentials of third-party suppliers, and provide user-facing transparency of online advertising.

In addition, very large online platforms and very large online search engines, reaching at least 45 million users (i.e. representing 10% of the European population) are subject to specific rules due to the particular risks they pose in the dissemination of illegal content and societal harms.

Very large online platforms will have to meet risk management obligations, external risk auditing and public accountability, provide transparency of their recommender systems and user choice for access to information, as well as share data with authorities and researchers.

What penalties will businesses face if they do not comply with the new rules?

The new enforcement mechanism, consisting of national and EU-level cooperation, will supervise how online intermediaries adapt their systems to the new requirements. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in that Member State and/or for coordinating with specialist sectoral authorities. To enforce the rules, it will be able to impose penalties, including financial fines. Each Member State will clearly specify the penalties in its national laws in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive enough to ensure compliance.

In the case of very large online platforms and very large online search engines, the Commission will have direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of the global turnover of a service provider.

The enforcement mechanism is not only limited to fines: the Digital Services Coordinator and the Commission will have the power to require immediate actions where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.

For rogue platforms refusing to comply with important obligations and thereby endangering people’s lives and safety, it will be possible as a last resort to ask a court for a temporary suspension of their service, after involving all relevant parties.

4.  Impact on Member States

How can the gap between laws in Member States be filled?

The experience of the last few years has shown that individual national action to rein in the problems related to the spread of illegal content online, in particular when very large online platforms are involved, falls short of effectively addressing the challenges at hand and protecting all Europeans from online harm. Moreover, uncoordinated national action puts additional hurdles in the way of smaller online businesses and start-ups, which face significant costs to comply with all the different legislation. Updated and harmonised rules will better protect and empower all Europeans, both individuals and businesses.

The Digital Services Act will propose one set of rules for the entire EU. All citizens in the EU will have the same rights, a common enforcement system will see them protected in the same way and the rules for online platforms will be the same across the entire Union. This means standardised procedures for notifying illegal content, the same access to complaints and redress mechanisms across the single market, the same standard of transparency of content moderation or advertising systems, and the same supervised risk mitigation strategy where very large online platforms are concerned.

Which institutions will supervise the rules, and who will select them?

The supervision of the rules will be shared between the Commission – primarily responsible for platforms and search engines with more than 45 million users in the EU – and Member States, responsible for any smaller platforms according to the Member State of establishment.

The Commission will have the same supervisory powers as it has under current anti-trust rules, including investigatory powers and the ability to impose fines of up to 6% of global revenue.

Member States will be required to designate competent authorities – the Digital Services Coordinators – for supervising compliance of the services established on their territory with the new rules, and to participate in the EU cooperation mechanism of the proposed Digital Services Act. The Digital Services Coordinator will be an independent authority with strong requirements to perform their tasks impartially and with transparency. The new Digital Services Coordinator within each Member State will be an important regulatory hub, ensuring coherence and digital competence.

The Digital Services Coordinators will cooperate within an independent advisory group, called the European Board for Digital Services, which can support with analysis, reports and recommendations, as well as coordinating the new tool of joint investigations by Digital Services Coordinators.

What will the Commission’s role be in the supervision of platforms?

The enforcement of the proposed Digital Services Act for providers of intermediary services established on their territory is primarily a task for national competent authorities, notably the Digital Services Coordinators.

However, when it comes to the supervision of very large online platforms and very large online search engines, the Commission will be the sole authority to supervise and enforce the specific obligations under the DSA that apply only to these providers. In addition, the Commission will, together with the Digital Services Coordinators, also be responsible for supervision and enforcement of any other systemic issue concerning very large online platforms and very large online search engines.

An important part of the supervisory and enforcement framework under the DSA will also be the Board, whose members will be independent Digital Services Coordinators.

How will the Commission finance costs associated with the new supervisory and enforcement competences?

In order to ensure effective compliance with the DSA, it is important that the Commission has at its disposal the necessary resources, in terms of staffing, expertise and financial means, for the performance of its tasks under this Regulation. To this end, the Commission will charge supervisory fees on providers of very large online platforms and very large online search engines, the level of which will be established on an annual basis. The overall amount of annual supervisory fees will be established on the basis of the overall costs incurred by the Commission to exercise its supervisory tasks under this Regulation, as reasonably estimated beforehand.

The annual supervisory fee charged to providers of very large online platforms and search engines should be proportionate to the size of the service, as reflected by the number of its recipients in the Union. The individual annual supervisory fee should not exceed an overall ceiling for each provider of very large online platforms and very large online search engines, taking into account the economic capacity of the provider of the designated service or services. This ceiling is set at 0.05% of the annual worldwide net income of the provider concerned.
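The fee mechanics described above boil down to simple arithmetic: a provider's share of the Commission's estimated supervisory costs is apportioned by its user numbers, then capped at 0.05% of its annual worldwide net income. The sketch below illustrates that calculation; all figures are invented for demonstration, and only the 0.05% ceiling comes from the text.

```python
# Illustrative sketch of the DSA supervisory-fee ceiling described above.
# All input figures are hypothetical; only the 0.05% cap is from the text.

def supervisory_fee(total_costs, provider_users, all_designated_users, net_income):
    """Provider's share of total supervisory costs, apportioned by EU user
    count and capped at 0.05% of annual worldwide net income."""
    proportional_share = total_costs * provider_users / all_designated_users
    ceiling = 0.0005 * net_income  # the 0.05% ceiling
    return min(proportional_share, ceiling)

# Hypothetical example: EUR 45m total costs, a provider with 90m of 300m
# designated users, and EUR 10bn worldwide net income.
fee = supervisory_fee(45_000_000, 90_000_000, 300_000_000, 10_000_000_000)
print(round(fee))  # the cap binds: prints 5000000, not the 13.5m share
```

Under these made-up numbers the proportional share (EUR 13.5m) exceeds the ceiling (EUR 5m), so the provider would pay only the capped amount.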

When will the DSA start applying?

The rules will start applying in two steps:

The DSA will be directly applicable across the EU fifteen months after entry into force (that is 20 days after publication in the Official Journal following the final adoption of the text), or from 1 January 2024, whichever is later. By then, Member States need to empower their national authorities to enforce the new rules on smaller platforms and rules concerning non-systemic issues on very large online platforms and very large online search engines.

For very large online platforms and very large online search engines, which are directly supervised by the Commission as regards systemic obligations, the new rules will kick in earlier: once designated by the Commission, providers of very large online platforms and very large online search engines have four months to comply with the DSA. Designation by the Commission takes place on the basis of user numbers reported by these service providers, which they must publish within three months of the DSA's entry into force.
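The two-step timeline above can be sketched as date arithmetic: the general application date is the later of entry into force plus fifteen months and 1 January 2024, while a designated platform must comply four months after its designation. The specific dates below are assumptions for illustration only, not dates taken from the text.

```python
from datetime import date

# Illustrative sketch of the DSA's two-step application timeline.
# The entry-into-force and designation dates are hypothetical placeholders.

def add_months(d, months):
    """Shift a date forward by a whole number of months."""
    total = d.month - 1 + months
    return d.replace(year=d.year + total // 12, month=total % 12 + 1)

entry_into_force = date(2022, 11, 16)  # assumed for illustration

# General applicability: the later of EIF + 15 months and 1 January 2024.
general_start = max(add_months(entry_into_force, 15), date(2024, 1, 1))
print(general_start)  # 2024-02-16

# A designated very large platform must comply 4 months after designation.
designation = date(2023, 4, 25)  # hypothetical designation date
vlop_deadline = add_months(designation, 4)
print(vlop_deadline)  # 2023-08-25
```

With the assumed dates, designated platforms would face their obligations months before the Regulation becomes generally applicable, which is the point of the two-step design.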

*Updated on 20/05/2022
