Understanding the EU’s Digital Services Act Requirements

Blog
May 8, 2023

"You can’t have facts. You can’t have truth. You can’t have trust. How can you have a democracy if you do not have integrity of facts?” That was Maria Ressa, 2021's Nobel Peace Prize winner. One might think she was describing life in some banana republic, but she was, in fact, describing social media and its spread of disinformation.

Disinformation — false information that is intended to deceive — and illegal content online pose grave threats to fundamental human rights. In response, the European Union passed a pioneering new law in 2022, the Digital Services Act, to protect its citizens’ fundamental rights. This article explains what the act is, how it impacts technology platforms, and how companies can comply with it.

What Is the Digital Services Act?

The Digital Services Act (DSA) is a European Union law designed to create safe online ecosystems in which the fundamental rights of all EU users of digital services are protected. The act aims to:

  • Protect the fundamental rights of users online by ensuring that online intermediaries act responsibly and diligently
  • Ensure a safe and trustworthy online environment by protecting users from online harms, dark patterns, and other manipulative or risky practices
  • Control the dissemination of illegal content online
  • Mitigate the societal risks of the dissemination of disinformation and illegal content in digital spaces
  • Facilitate a single market and common regulatory framework for online services across the EU, instead of 27 separate regimes, one for each member state

The DSA, along with the Digital Markets Act (DMA), adds new rules to the older e-Commerce Directive to bring it up to date with modern online challenges. The DMA ensures transparency and accountability from large platforms that are in a position to act as gatekeepers for smaller online businesses. Both laws were adopted by the European Parliament in 2022, and the DSA entered into force in November 2022.

Who Is Impacted by the DSA?

The Digital Services Act applies to any intermediary service that provides “mere conduit” (network transmission), caching, or hosting services. Some examples of such intermediary services are:

  • Social media platforms, like Facebook
  • App stores, like the Apple App Store
  • Online marketplaces, like Amazon
  • Hosting service providers
  • Cloud service providers, like Microsoft Azure

A service may be established in any of the 27 EU member states or outside of them, but it must comply with the DSA if it provides services to users in the EU.

Key Transparency Obligations

Obligations under the DSA (Source: The European Commission, licensed under CC BY 4.0)

The DSA imposes various obligations on online intermediary services, as shown above. Some of the key obligations are explained below.

Require Transparency Around Online Advertising

The DSA strives to protect users, and minors in particular, from data profiling and the unauthorized use of personal data for online advertising. Its rules, illustrated in the sketch after this list, are as follows:

  • Online platforms must prepare a code of conduct and follow due diligence regarding their advertisers and other entities in the advertising value chain.
  • Targeted advertising to minors is prohibited.
  • Advertising based on profiling that uses sensitive personal data, such as ethnicity or political views, is prohibited.
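As an illustration only, here is a minimal sketch of how a platform might encode these two prohibitions as a pre-serve eligibility check. The data model, field names, and sensitive-category list are hypothetical assumptions, not taken from the DSA or any real ad system.

```python
from dataclasses import dataclass, field

# Hypothetical sensitive profiling categories that must not drive ad targeting.
SENSITIVE_CATEGORIES = {"ethnicity", "political_views", "religion", "sexual_orientation", "health"}

@dataclass
class AdRequest:
    user_is_minor: bool                                        # platform's best-effort minor determination
    targeting_signals: set[str] = field(default_factory=set)   # profiling categories used to select the ad

def ad_targeting_allowed(request: AdRequest) -> tuple[bool, str]:
    """Return (allowed, reason) for a personalized ad placement."""
    if request.user_is_minor:
        return False, "Targeted advertising to minors is prohibited"
    blocked = request.targeting_signals & SENSITIVE_CATEGORIES
    if blocked:
        return False, f"Profiling on sensitive categories is prohibited: {sorted(blocked)}"
    return True, "ok"

# Example: an ad targeted on political views is rejected even for an adult user.
print(ad_targeting_allowed(AdRequest(user_is_minor=False, targeting_signals={"political_views"})))
```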

Improve Content Moderation

Each EU member state is free to define what constitutes illegal content. The DSA standardizes the workflows for detecting, reporting, and acting on illegal content transparently. It recommends the following, illustrated in the sketch after this list:

  • Easy reporting for users: Users must be able to easily report illegal content and get updates on moderation actions.
  • Accountability to users: Users are guaranteed a fundamental right to expression. They must be notified when their content is restricted or removed, and they can challenge the platform's moderation decisions.
  • Use of trusted flaggers: Trusted flaggers are entities (never individuals) with expertise in illegal content, like government or nongovernmental agencies that specialize in analyzing terrorist or child abuse content. Digital platforms must provide priority channels for them to report illegal content and make it a high priority to take down such content.
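As a hedged illustration of such a notice-and-action workflow, the sketch below models a notice queue that prioritizes trusted flaggers and records the reasons given to the uploader, along with an appeal channel. The field names and the appeal endpoint are hypothetical, not defined by the DSA.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """A report of allegedly illegal content. Field names are illustrative only."""
    content_id: str
    reason: str
    from_trusted_flagger: bool = False
    notice_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def enqueue_notice(queue: list[Notice], notice: Notice) -> None:
    """Notices from trusted flaggers jump ahead of ordinary user reports."""
    if notice.from_trusted_flagger:
        queue.insert(0, notice)
    else:
        queue.append(notice)

def resolve_notice(notice: Notice, action: str, grounds: str) -> dict:
    """Record the moderation decision, its grounds, and how the uploader can appeal."""
    return {
        "notice_id": notice.notice_id,
        "content_id": notice.content_id,
        "action": action,                  # e.g., "removed", "restricted", "no_action"
        "grounds": grounds,                # the legal or terms-of-service basis for the decision
        "appeal_channel": "/appeals/new",  # hypothetical internal complaint endpoint
        "decided_at": datetime.now(timezone.utc).isoformat(),
    }
```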

Simplify and Explain Terms and Conditions

Intermediaries must explain their terms and conditions in clear, plain language that is accessible to everyone, including minors, and make those explanations available in the languages of the EU member states where they offer their services.

Publish Transparency Reports

The DSA requires all intermediaries to publish reports covering the following (a report-assembly sketch follows the list):

  • Annual reports on their content moderation, including use of automated tools
  • Orders from member states to remove illegal content
  • The average number of monthly active users of their services in the EU
  • Number of notices from trusted flaggers and the actions taken
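The sketch below shows one way such figures could be assembled into a machine-readable report. The JSON schema and field names are purely illustrative assumptions; the DSA does not prescribe this particular format.

```python
import json
from datetime import date

def build_transparency_report(
    period_start: date,
    period_end: date,
    moderation_actions: int,
    automated_actions: int,
    member_state_orders: int,
    avg_monthly_active_users_eu: int,
    trusted_flagger_notices: int,
) -> str:
    """Assemble an illustrative machine-readable transparency report as JSON."""
    report = {
        "reporting_period": {"start": period_start.isoformat(), "end": period_end.isoformat()},
        "content_moderation": {
            "total_actions": moderation_actions,
            "actions_by_automated_tools": automated_actions,
        },
        "member_state_removal_orders": member_state_orders,
        "average_monthly_active_users_eu": avg_monthly_active_users_eu,
        "trusted_flagger_notices_received": trusted_flagger_notices,
    }
    return json.dumps(report, indent=2)

# Example values are made up for illustration.
print(build_transparency_report(date(2023, 1, 1), date(2023, 12, 31), 12400, 9800, 37, 2500000, 310))
```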

Make Content Recommendations Transparent

The DSA obligates an online platform that uses a recommender algorithm to explain its main parameters and how users can influence those parameters to change their content recommendations. Platforms must also allow users to adjust their recommender settings; very large platforms must offer at least one option that is not based on profiling.
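As a rough sketch under assumptions, the example below shows how a platform might disclose its main ranking parameters and honor a user preference to turn off profiling-based signals. The parameter names and weights are hypothetical, not a real ranking model.

```python
from dataclasses import dataclass

# Hypothetical main parameters a platform might disclose for its recommender.
MAIN_PARAMETERS = {
    "recency": "How recently the content was posted",
    "engagement": "Likes, shares, and watch time from other users",
    "follows": "Accounts the user follows",
    "profile_interests": "Interests inferred from the user's activity (profiling)",
}

@dataclass
class RecommenderPreferences:
    use_profiling: bool = True   # user can switch to a feed not based on profiling
    boost_follows: bool = True

def rank_weights(prefs: RecommenderPreferences) -> dict[str, float]:
    """Return ranking weights that respect the user's stated preferences."""
    weights = {"recency": 0.4, "engagement": 0.3, "follows": 0.2, "profile_interests": 0.1}
    if not prefs.use_profiling:
        weights["profile_interests"] = 0.0
    if not prefs.boost_follows:
        weights["follows"] = 0.0
    # Renormalize so the remaining signals still sum to 1.
    total = sum(weights.values())
    return {name: value / total for name, value in weights.items()}

print(rank_weights(RecommenderPreferences(use_profiling=False)))
```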

Provide Points of Contact

All intermediaries must designate points of contact for national authorities, the European Commission, and the European Board for Digital Services, as well as for users of the service. Companies based outside the EU that provide intermediary services in the EU must also designate legal representatives within the EU.

Obligations on Very Large Platforms and Search Engines

The DSA recognizes that very large online platforms (VLOPs) and very large online search engines (VLOSEs), those with more than 45 million average monthly active users in the EU, face greater systemic risks because of their larger reach in society. The act focuses on four categories of systemic risks:

  • Dissemination of illegal content and conduct of illegal activities like trade in prohibited goods
  • Impact of the service on fundamental rights like human dignity, freedom of expression, privacy, nondiscrimination, consumer protection, and rights of children
  • Negative effects on democratic and electoral processes
  • Adverse impacts on public health, the protection of minors, physical and mental well-being, and protection against gender-based violence and disinformation

These VLOPs and VLOSEs must therefore comply with additional obligations (a risk-register sketch follows the list):

  • They must conduct risk assessments for the above systemic risks.
  • They must design plans and risk controls for active mitigation of these systemic risks.
  • They must have oversight of their risk controls through independent audits.
  • Their risk controls must not restrict fundamental rights like freedom of expression.
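As an internal-tooling sketch only, the example below models a simple risk register keyed to these four categories, tracking mitigations and whether they have been independently audited. The structure is an assumption; the DSA does not mandate any particular register format.

```python
from dataclasses import dataclass, field

# The four systemic risk categories the DSA asks VLOPs and VLOSEs to assess.
RISK_CATEGORIES = (
    "illegal_content_and_activities",
    "fundamental_rights",
    "democratic_and_electoral_processes",
    "public_health_minors_and_wellbeing",
)

@dataclass
class RiskEntry:
    category: str                       # one of RISK_CATEGORIES
    description: str
    severity: str                       # e.g., "low", "medium", "high"
    mitigations: list[str] = field(default_factory=list)
    independently_audited: bool = False

def outstanding_items(register: list[RiskEntry]) -> list[RiskEntry]:
    """Entries that still need mitigations or an independent audit."""
    return [entry for entry in register if not entry.mitigations or not entry.independently_audited]

register = [
    RiskEntry(
        category="democratic_and_electoral_processes",
        description="Coordinated disinformation ahead of elections",
        severity="high",
        mitigations=["election integrity policy", "fact-checking partnerships"],
    ),
]
print(outstanding_items(register))  # flagged until its controls are independently audited
```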

How Does the DSA Relate to the GDPR?

The General Data Protection Regulation (GDPR) is the EU's primary data protection regulation, designed to protect the personal data of users. Under the GDPR, users have the right to review the personal data that platforms store about them and to request its deletion.

The DSA does not override the GDPR. The two are complementary, and companies must comply with both.

Enforcement and Penalties

The enforcement structure consists of the following:

  • Digital services coordinators: Each EU member state appoints one of its competent national authorities as the digital services coordinator (DSC), which is responsible for supervising intermediaries and enforcing the DSA in that state.
  • European Board for Digital Services: It consists of DSCs from all member states and is responsible for advising and coordinating regulatory activities.

The penalties for noncompliance include:

  • A fine of up to 6% of the company’s annual worldwide turnover for failure to comply with the DSA’s obligations.
  • A fine of up to 1% of the company’s annual worldwide turnover for supplying incomplete, incorrect, or misleading information.
  • Periodic penalty payments of up to 5% of the company’s average daily worldwide turnover per day of continued noncompliance.

Compliance Timeline

Important compliance dates are listed below:

  • November 16, 2022: The DSA came into force.
  • February 17, 2023: By this date, online platforms and search engines had to publish their average monthly active user numbers in the EU.
  • February 17, 2024: The DSA applies in full to all regulated entities, and all member states must have designated their DSCs.
  • Within four months of designation: Once the European Commission designates a service as a VLOP or VLOSE, it must comply with all obligations of the DSA, conduct its first annual risk assessment, and submit it to the European Commission within four months.

Certa Helps You With DSA Compliance

Based on this list of new obligations, it’s clear that the DSA will have wide-ranging impacts on the online ecosystem in the EU. Users can look forward to more transparency and accountability, especially from the large platforms that have so far answered to almost no one. Companies with users in the EU can improve their standing with users by complying with the DSA in good faith.

Certa's risk management and reporting features enable you to effectively comply with the Digital Services Act. With Certa, you can:

  • Conduct comprehensive risk assessments for the DSA's four systemic risks
  • Implement risk controls as automated workflows built with Certa's no-code Studio
  • Generate the annual transparency reports in machine-readable formats, as required by the DSA
  • Maintain your terms and conditions and their explanations in Certa's central repository for easy tracking and publishing
  • Enable trusted flaggers to generate reports on notices as mandated by the DSA
  • Integrate Certa's workflows with your user management systems to automatically publish monthly active user reports for DSA compliance

To learn more about how you can use Certa for DSA compliance, talk to our experts today.