Everything you wanted to know about the Digital Services Act (but were afraid to ask!)

Can Şimşek
Dec 19, 2020
Uncle Ben and the EU Commissioners
1. Introduction

The European Commission has finally presented the prospective Digital Services Act (DSA) and Digital Markets Act (DMA), which introduce new obligations for internet platforms. According to the drafts, the DSA and DMA provide for potential fines of up to 6% and 10%, respectively, of a company's global annual turnover. Although the Commissioners deliberately avoid naming names, the objective criteria in the regulations cover the big platforms that play a gatekeeping role in the digital economy: platforms like Google, Amazon, Facebook and Apple. Indeed, while announcing the proposals, Thierry Breton, the European Commissioner for the Internal Market, and Margrethe Vestager, the Executive Vice President of the European Commission for a Europe Fit for the Digital Age, kept repeating that the regulations do not target any particular firm but that "with bigger size, comes bigger responsibility"; as if they were reminding the American tech giants of the "Peter Parker principle", popularized by the wise words of Uncle Ben in the famous American comic book Spider-Man.

Of course, the proposed regulations will go through a long and complicated ratification process, with the EU Member States, the European Parliament, and certainly the lobbyists and trade associations influencing their final versions. During the press conference on 15 December, Vestager optimistically estimated that the deliberations on the proposed Regulations are likely to take more than a year, with another six months for their implementation.

Although the DMA has drawn more attention from the press, since it enables higher fines and serious competition-policy tools that might change the structure of the internal market, the DSA in fact brings even more interesting obligations, because it could change the structure of the online realm itself. With this in mind, I have tried to provide a summary of the proposed Digital Services Act and a brief commentary on its initial draft in this article.

2. What is the Digital Services Act?

2.1 The Legal Status of the DSA

The Digital Services Act is a proposed European Regulation concerning online platforms. The legal basis for the proposal is Article 114 of the Treaty on the Functioning of the European Union, which provides for the establishment of measures to ensure the functioning of the Internal Market. In general, European Regulations aim to achieve uniformity in the national laws addressing the issue at stake: a Regulation is a binding legislative act of the EU, directly applicable in each Member State, whereas a Directive sets out goals and rules that Member States must transpose into national law in a way they deem appropriate. The proposal foresees the deletion of Articles 12 to 15 of the E-Commerce Directive, since the same issues are addressed within the new Regulation.

2.2 The Scope of the DSA

Although the very first Article of the draft states that the Regulation "lays down harmonised rules on the provision of intermediary services in the internal market", the territorial scope of the Regulation corresponds to the digital borders of the EU rather than its physical borders: the obligations it puts forward cover all players offering goods, information or services in the Union, regardless of their place of establishment. As for the material scope, the Regulation sets up a framework for the conditional exemption from liability of providers of intermediary services, as well as rules on specific due diligence obligations tailored to certain categories of service providers. The first Article also declares the aims of the Regulation: to "contribute to the proper functioning of the internal market for intermediary services and to set out uniform rules for a safe, predictable and trusted online environment", in which the fundamental rights enshrined in the European Charter are effectively protected. Consequently, it applies to all digital service providers as a horizontal Regulation.

3. Obligations that the DSA Lays Down

3.1 General Obligations

As Thierry Breton rightfully remarked during the press conference introducing the proposals, the DSA does not regulate online content; it introduces obligations for service providers. According to the draft, service providers acting as mere conduits will not be liable for the information transmitted, similar to the old regime established under the E-Commerce Directive. The same applies to caching and hosting activities, subject to certain reasonable conditions (Articles 4 and 5). To remain a mere conduit, a platform must not initiate the transmission, modify the information or select the receiver. On the other hand, digital service providers are obliged to act against illegal content (Article 8) and to provide information about the recipients of the service (Article 9) once they receive an order issued by a relevant national administrative or judicial authority. They will also be obliged to document these actions in "transparency reports" on the removal and disabling of information considered to be illegal content or contrary to the providers' "terms and conditions" (Article 13).

As for the hosting services, the proposed legislation requires service providers to put in place mechanisms that allow third parties to notify the presence of alleged illegal content (Article 14) and further obliges the online platforms to ensure that notices submitted by entities that are granted the status of “trusted flaggers” are treated with priority (Article 19). In this context, the platform will be obliged to inform the users with a “statement of reason” in case it decides to remove or disable access to content provided by that user (Article 15).

The DSA sets out the measures for online platforms to adopt against misuse like suspension of the accounts (Article 20). The platforms will be also required to inform competent enforcement authorities in the event they become aware of any information giving rise to a suspicion of serious criminal offenses involving a threat to the life or safety of persons (Article 21).

Moreover, the Regulation lays down a set of due-process obligations applicable to all online platforms except those that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC (Article 16). Accordingly, online platforms should provide an internal complaint-handling system in respect of decisions taken on alleged illegal content or information incompatible with their terms and conditions, available for at least six months following such a decision (Article 17). Most peculiarly, online platforms are obliged to engage with certified "out-of-court dispute settlement bodies" whose decisions will be binding (Article 18).

Momentously, all service providers are obliged to establish a single point of contact (Article 10), whereas providers which do not have an establishment in the Union but which offer services in the Union are obliged to designate a legal or natural person as their legal representative in one of the Member States (Article 11).

In terms of e-commerce platforms, the Regulation harbors many novelties. Under Article 22, online platforms are obliged to receive, store, make reasonable efforts to assess the reliability of, and publish specific information on the traders using their services, as well as to organize their interfaces in a way that enables traders to respect Union consumer and product safety law. Besides this, e-commerce platforms will also be required to publish information on the removal and disabling of information considered to be illegal or contrary to their terms and conditions (Article 23).

Lastly, Article 24 requires transparency in the context of online advertising, including the obligation to display information on whose behalf the advertisement is being shown and why it is being shown to the particular service recipient. For this purpose, sub-paragraph (b) obliges the platforms to provide “meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed”.

3.2 Additional Obligations for the “Very Large Platforms”

Apparently, the Commission took a layered approach in the proposed Regulation, introducing additional obligations for very large online platforms with a significant reach in the Union in order to tackle the systemic risks they bring in proportion to their size. The threshold for service providers in scope of these obligations is set at more than 45 million recipients of the service. This threshold is to be adjusted by the Commission so that it consistently corresponds to 10% of the Union's population.
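As a quick back-of-the-envelope illustration, the 45 million figure does indeed correspond to roughly 10% of the Union's population (the population figure below is an approximation used for this sketch, not a number taken from the draft):

```python
# Rough sanity check of the "very large online platform" threshold.
# ~447 million is the approximate EU-27 population in 2020; this is an
# assumption for illustration, as the draft only states the 45 million figure.
EU_POPULATION_2020 = 447_000_000

# 10% of the population, rounded to the nearest million
threshold = int(round(EU_POPULATION_2020 * 0.10, -6))
print(f"{threshold:,} recipients")  # 45,000,000 recipients
```

This also shows why the draft lets the Commission adjust the number over time: the threshold tracks a percentage of a changing population, not a fixed headcount.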

Differing from the other platforms, "very large online platforms" are obliged to conduct risk assessments on the systemic risks brought about by or relating to the functioning and use of their services (Article 26), as well as to take reasonable and effective measures aimed at mitigating those risks (Article 27). Furthermore, they should submit themselves to external and independent audits at least once a year (Article 28). In addition to the general transparency requirements, very large platforms have to make the information regarding the advertisements they display publicly available through a repository (Article 30). By the same token, they will be subject to specific and stricter transparency reporting obligations (Article 33). Crucially, they should also share information about the "recommender systems" they use and, where possible, give options to the service recipients (Article 29).

The Regulation also contains provisions concerning due diligence obligations, namely the processes for which the Commission will support and promote the development and implementation of harmonised European standards (Article 34); the framework for the development of codes of conduct (Article 35); and the framework for the development of specific codes of conduct for online advertising (Article 36).

Remarkably, very large platforms will be obliged to appoint one or more compliance officers to ensure compliance with the obligations laid down in the Regulation (Article 32), and they will be obliged to provide access to data to the Digital Services Coordinator of establishment, the Commission and vetted researchers (Article 31). Last but not least, there is also a provision on crisis protocols to address extraordinary circumstances affecting public security or public health (Article 37).

3.3 Provisions Concerning the Implementation and Enforcement

Chapter IV of the Regulation lays down provisions concerning its implementation and enforcement. For starters, Member States should designate primary national authorities as independent and impartial "Digital Services Coordinators" (DSCs), granted specific powers to ensure the consistent application of the Regulation (Articles 38, 39 and 41). DSCs can receive complaints against service providers for breaches of the obligations (Article 43). They are required to publish annual reports on their activities (Article 44) and to cooperate with the DSCs of other Member States (Article 45). They can also participate in joint investigations (Article 46). According to Articles 47, 48 and 49, the European Board for Digital Services, an independent advisory group of Digital Services Coordinators, will be established.

If a service provider has its main establishment within the Union, the Member State in which that establishment resides will have jurisdiction to enforce the Regulation. If not, the Member State where a service provider not established in the EU appoints a legal representative will have jurisdiction, considering the function of legal representatives under this Regulation. In this regard, Recital 76 elaborates on Article 40 by stating that "all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected". Most importantly, it is up to Member States to lay down the rules on penalties (not exceeding 6% of the annual turnover of the service provider) applicable to breaches of the obligations under this Regulation (Article 42).

It seems that the Regulation also offers a stratified framework in terms of supervision, investigation, enforcement and monitoring, one that is rather rigorous when it comes to very large online platforms. It provides for "enhanced supervision" in the event such platforms infringe the provisions of Section 4 of Chapter III (Article 50), as well as the possibility for the Commission to intervene in case the infringements persist (Article 51). In such situations, the Commission can carry out investigations, including through requests for information (Article 52), interviews (Article 53) and on-site inspections (Article 54). It can adopt interim measures (Article 55) and make the commitments given by very large online platforms "binding" (Article 56), as well as monitor their compliance with the Regulation (Article 57). If very large online platforms do not comply with the decisions, the Commission can adopt non-compliance decisions (Article 58), issue fines of up to 6% of their annual turnover (Article 59) and impose periodic penalty payments (Article 60).

One last thing worth mentioning is that recipients of intermediary services have the right to mandate a body, organisation or association to exercise their rights on their behalf (Article 68). Lastly, the Regulation sets out procedural guarantees in proceedings before the Commission, in particular the right to be heard and the right of access to the file (Article 63) and the publication of decisions (Article 64), besides limitation periods for the imposition of penalties (Article 61) and for their enforcement (Article 62).

4. Commentary

4.1 Legitimacy of new ex-ante Regulations

The idea of introducing ex-ante regulations that apply to digital service providers in a horizontal manner has been on the table for quite a long time. In the course of the 20 years since the adoption of the E-Commerce Directive, which exempted online intermediaries from legal liability, the topology of the internet has changed drastically, and some online platforms have become crucial for almost every domain of daily life, from commerce to politics. Although the "gatekeeper platforms" have been offering great services to their users, they have also been rather reckless in handling such great power when it comes to the personal data they gather and the illegal content circulating online. Moreover, their business models circumvented existing legal frameworks, and some of them disrupted markets by favoring their own products. On the market-disruption side, the European Commission has already tried to take action against some of the tech giants with several competition law probes under the leadership of Margrethe Vestager, who is in fact one of the co-drafters of the DSA and DMA, with the view that existing competition tools need to be supported by Regulations.[1]

As for the more political side, European leaders have been aware for some time that internet platforms are game changers. Events like the Cambridge Analytica/Facebook scandal gathered a lot of attention around this issue, casting doubt on whether the personal data of European citizens are in safe hands. The rise of terrorist attacks in Europe was another issue that captured attention and made people question the illegal content circulating online. In parallel to these concerns, European leaders have been rightfully calling for the overarching goal of "European digital sovereignty" for some time now. For one, the French President Emmanuel Macron, a very vocal defender of European sovereignty, has been emphasizing the need for "fair regulation of digital space".[2]

In accordance with the emerging consensus on the inefficiency of the soft-law approach and self-regulation, the European Commission finally stepped in. Needless to say, as a strong supporter of the liberal market economy, the EU never pointed to the origin of these firms as the main problem. Instead of directly targeting the American tech giants by breaking them up or taking protectionist measures, the EU decided to address these issues through regulation. Thus, the DSA and DMA are legitimate responses to pressing policy goals, complementing the General Data Protection Regulation (GDPR). Indeed, given that the Covid-19 pandemic has made online platforms an even more crucial part of our daily lives than they already were, it was about time we discussed their responsibilities.

4.2 The DSA and the GDPR

If one leg of the DSA is responsibility, the other is transparency; one needs both of them to run, so to speak. Transparency is not a new principle in the realm of regulating digital technologies: Article 5 of the GDPR already requires that personal data be processed lawfully, fairly and in a transparent manner in relation to the data subject.[3] However, the transparency principle has a rather limited reach within the GDPR framework, and it is often interpreted with respect to the purposes of data processing rather than the means of processing, due to intellectual property and commercial secrecy concerns. Scholars have therefore been discussing this issue and calling for further obligations that would clarify transparency requirements regarding algorithms.[4] In this respect, the substance of the DSA complements the shortcomings of the GDPR framework. For instance, after the Cambridge Analytica revelations, the European Data Protection Board, which was established under the GDPR, stated that the use of "sophisticated profiling techniques to monitor and target voters and opinion leaders" raises concerns which surpass a mere privacy and data protection issue and threaten "the trust in the integrity of the democratic process".[5] Yet national data protection authorities lacked the necessary means and legal authority to tackle this issue effectively. Establishing an enforcement model similar to the GDPR's, the DSA proposes independent "Digital Services Coordinators" (akin to the national data protection authorities) which will, this time, be granted the necessary powers. Besides this, as mentioned in the third chapter of this article, very large platforms will be subject to stricter transparency reporting obligations. Furthermore, they will be obliged to share information about the "recommender systems" they use and, where possible, give options to the service recipients. Lastly, they will need to submit themselves to external independent audits. By the look of things, these rules in the draft Regulation are huge steps towards establishing algorithmic transparency.

4.3 Legal Implications of the DSA on the Responsibility of Platforms

As mentioned above, the EU has governed the liability of online intermediaries through a regulatory framework containing a set of horizontal liability exemptions, which relieves intermediaries of responsibility for third-party content in order not to hamper the development of e-commerce. In principle, under EU law, intermediaries may not be obliged to monitor their services in a general manner in order to detect and prevent the illegal activity of their users. The jurisprudence of the Court of Justice of the European Union (CJEU) has further clarified this general monitoring ban, including its anchorage in the European Charter, in particular the right to the protection of personal data (Article 8), the freedom of expression and information (Article 11) and the freedom to conduct a business (Article 16), alongside the free movement of goods and services in the internal market.[6] As part of primary law, these legal guarantees constitute the basis for the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive and Article 17(8) of the Directive on Copyright in the Digital Single Market. Thus, even though the DSA repeals Article 15 of the E-Commerce Directive, the CJEU would still need to balance the new obligations against these legal safeguards. As for concerns about the implications for freedom of expression, the DSA seems to strike a fair balance by limiting its scope to "legality". Accordingly, platforms will be required to cooperate in tackling illegal content such as child sexual abuse material, misuse of their platforms that impinges on fundamental rights, and intentional manipulation of platforms, such as using bots to influence elections and public health. Hate speech, on the other hand, will be covered only insofar as it is illegal.
Thus, national courts and competent authorities will need to act very meticulously to make sure that they safeguard fundamental rights instead of infringing them. For this purpose, the DSA also foresees a transparency reporting system in relation to the removal and disabling of information considered to be illegal content or contrary to the providers' "terms and conditions".

5. Conclusion

The proposed DSA is an ex-ante Regulation which aims to create a safe and transparent online environment by unifying the legal framework within the European internal market; a very legitimate policy goal, especially given the increasing importance of digital platforms in the context of the Covid-19 pandemic. With the changes it harbors in terms of the transparency and responsibility of online platforms, it might have a great impact on the future of the internet if it is adopted. Judging from the draft, the Commission seems to strike a rather fair balance between different legitimate interests. Yet the final version of the Regulation remains to be seen. One thing is certain: during the "trilogues", lawmakers will debate the articles concerning enforcement at length. After all, a Regulation is only as strong as its enforcement.

Bibliography

[1] Natasha Lomas, "Europe to limit how big tech can push its own services and use third-party data". Available here: https://techcrunch.com/2020/10/29/europe-to-limit-how-big-tech-can-push-its-own-services-and-use-third-party-data/ (accessed December 2020)

[2] See: "France and Germany Must Build EU Sovereignty — Macron", France in the United Kingdom — La France au Royaume-Uni. Available here: https://uk.ambafrance.org/France-and-Germany-must-build-EU-sovereignty-Macron (accessed December 2020)

[3] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

[4] See, inter alia: Mireille Hildebrandt, The Dawn of a Critical Transparency Right for the Profiling Era, DIGITAL ENLIGHTENMENT Y.B. 41 (2012); Tal Z. Zarsky, Transparent Predictions, 2013 U. ILL. L. REV. 1503 (2013); Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 WASH. L. REV. 1 (2014); Bryan Casey, Ashkon Farhangi & Roland Vogl, Rethinking Explainable Machines: The GDPR's "Right to Explanation" Debate and the Rise of Algorithmic Audits in Enterprise, 34 BERKELEY TECH. L.J. 145 (2019); Margot E. Kaminski, The Right to Explanation, Explained, 34 BERKELEY TECH. L.J. 189 (2019).

[5] EDPB, Statement 2/2019 on the use of personal data in the course of political campaigns (2019). Available here: https://edpb.europa.eu/sites/edpb/files/files/file1/edpb-2019-03-13-statement-on-elections_en.pdf (accessed December 2020)

[6] See: Senftleben, M. R. F., & Angelopoulos, C. J. (2020). The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market. Available here: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717022 (accessed December 2020)
