Takeaways from the webinar ‘Delimiting Systemic Risks in the DSA’
On Thursday 16th May 2024, the new Sciences Po Law School research group DSA Decoded convened a launch event bringing together experts from Europe and beyond to discuss the challenges and opportunities of the systemic risk framework established by Articles 34-35 of the EU's Digital Services Act (DSA). This framework applies to 'very large online platforms and search engines' (VLOPs) with more than 45 million monthly active users in the EU. It requires them to monitor and address a number of broadly defined 'systemic risks' in areas such as public health and security, fundamental rights, electoral integrity and civic discourse.
DSA Decoded, an initiative funded by the Project Liberty Institute, aims to monitor and critically evaluate how European regulators implement this risk management regime, and to position itself as a key resource for regulators and other stakeholders participating in this process. Indeed, the DSA foresees an enforcement structure in which academic researchers and civil society organisations play an essential role in identifying and defining systemic risks, as well as in holding platform companies accountable for effectively mitigating them.
This first meeting, held under the Chatham House Rule, was invite-only, with approximately 20 attendees, including academics and regulators. The event consisted of two panels of leading experts, who offered perspectives from their own research and professional experience on how the DSA systemic risk framework might look in practice, as well as on its potential benefits and weaknesses. What follows briefly summarises the key takeaways of the conversation.
Panel 1: Putting the DSA Into Practice
The first panel examined the practical implications of the DSA’s entry into force and the roles of different stakeholders. A central question was the role civil society organisations (CSOs) and third-party researchers will play in the systemic risk framework. The conversation also explored the role of independent research and data access under Article 40 DSA [1] in informing risk management and addressing information asymmetries between platforms and other stakeholders, as well as the resources and funding that would be needed for meaningful expert participation and stakeholder engagement.
A key aspect of the discussion was the involvement of civil society and academia in developing standards and frameworks for risk assessment and mitigation. Participants noted that national- and EU-level regulators are eager to collaborate with civil society, and that CSOs are, in turn, already actively contributing input. In particular, it was suggested that while audits [2] will likely focus on technical accuracy rather than fundamentally questioning platforms' risk management approaches, academics and civil society can fill this gap with more critical independent scrutiny.
There was consensus that the Commission has adopted a proactive and assertive approach to overseeing risk management. This is illustrated by recent enforcement actions, such as the proceedings against Meta regarding the demotion of political content and the protection of minors, as well as the TikTok Lite investigation concerning the launch of a new service without a prior risk assessment. The panel stressed the role of civil society in scrutinising the Commission's enforcement strategy and in providing information to facilitate enforcement. Involving diverse independent stakeholders will be crucial to maintaining the credibility and legitimacy of the regulatory process. Participants highlighted the need to broaden the scope beyond traditional NGOs, for example to universities and journalistic organisations.
Participants also discussed challenges for CSOs in this field, such as the lack of transparent and consistent procedures for civil society input, and the fact that VLOPs' risk assessments are still not public. This means CSOs cannot provide informed feedback on risk management approaches and have little sense of where to allocate their resources, while the Commission may be left to dominate the narrative in shaping how risks are understood and defined. Participants also noted the shrinking civic space and increasing legal repression of activism in certain member states, as well as the growing emphasis in Brussels on security and geopolitical issues over fundamental rights.
Another important topic was funding: the DSA relies heavily on civil society to contribute to and monitor its implementation, but where will the resources to do so come from? Participants placed particular emphasis on disparities in resources and capacities between organisations, which could leave the perspectives and interests of CSOs other than established digital rights organisations underrepresented. Some participants suggested strategies to address these issues and ensure more representative contributions, such as establishing more formalised participation structures or targeted public funding to support diverse participation. At the same time, participants discussed the issues raised by such funding mechanisms, given the potential for conflicts of interest or compromised independence.
The discussion then turned to the role of transparency and data access in empowering stakeholders and facilitating evidence-based regulation. Participants questioned how easy it will be in practice for researchers to access data under Article 40, discussing examples of VLOPs imposing onerous conditions and interpreting the provision inconsistently. The data access provision has attracted significant attention as a new and innovative regulatory measure, but given its novelty, standardised application procedures are still lacking. To address these challenges, speakers suggested that the European Digital Media Observatory (EDMO) could act as an intermediary to facilitate access to data. Beyond the much-discussed Article 40, participants also highlighted the perhaps underestimated importance of Article 39 on publicly accessible ad archives, which could already provide researchers with easy access to a wealth of relevant data in the short term.
Panel 2: Understanding Risk Regulation
The second panel centred on the concept, terminology and framing of 'systemic risks', considering lessons that could be drawn from other regulatory fields where risk management features prominently, such as environmental and financial regulation. The conversation delved into the strengths and weaknesses of the DSA framework for risk assessment and mitigation, and offered further reflections on how civil society could contribute to its implementation.
One recurring topic was the origins and intentions of the terminology of 'systemic risks' used in the DSA. Participants noted that the exact meaning of 'systemic' remains unclear, which is legally consequential given its importance in defining the scope not only of Articles 34-35 but also, for example, of Article 40 on research data access. Participants variously suggested that 'systemic' could refer to harms not related to individual items of content; to societal harms; or to harms relating to platforms' business models and design (a reading in line with the drafting history, as the term 'systemic risks' was introduced after proposed provisions targeting business models more directly were rejected).
Additionally, participants noted that another key goal of the systemic risk framework was to address concerns about content deemed 'legal but harmful', after early discussions about regulating such content directly (as in the UK's Online Safety Act) were abandoned. Multiple participants expressed concerns that the systemic risk framework still gives regulators considerable power to push for censorship of politically disfavoured content, citing as an example recent demands from the Commission that VLOPs remove more Hamas-related content.
Another interesting topic of discussion was potential mechanisms to check this kind of political interference. While participants cited legal analyses suggesting that the principle of legality will restrict the Commission's ability to mandate specific content policies, they also noted that interventions which do not directly target specific content (e.g. ID verification) can still have significant indirect impacts on freedom of expression. Moreover, the Commission can influence platforms' policies not only through formal enforcement measures but also through informal backroom discussions. In this context, the role of civil society was again emphasised: it was suggested that civil society pushback against politicised enforcement measures had emboldened Commission staff to advocate a more restrained approach. However, concerns were also raised that this episode might simply lead the Commission to be less transparent about its interactions with VLOPs in future.
There was also a lengthy discussion of the significance of framing policy issues in terms of 'risks'. The DSA takes inspiration from financial and environmental regulation, where such risk management approaches are better established, albeit with patchy records of success. Highlighting similarities and differences with these fields, participants suggested several issues with applying risk management techniques to platform regulation. First, platforms are complex, interconnected sociotechnical systems, making it difficult to identify and address risks individually without changing underlying platform architectures. Second, there is little consensus on what harms platforms might cause or on how those harms should be defined and measured. This makes it difficult to apply traditional risk assessment methods based on quantifying the probability and severity of harmful events, and especially difficult to establish comparable methods and metrics across different platforms and risk areas. Third, and relatedly, many risks mentioned in Article 34 are essentially unquantifiable, yet participants suggested that VLOPs and regulators might be incentivised to over-rely on quantitative metrics as a way to appear objective while depoliticising contested regulatory issues.
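To illustrate the quantification problem, consider the textbook risk score used in fields like financial and environmental regulation (a simplified sketch for illustration, not a method prescribed by the DSA): each foreseeable harm scenario $i$ is assigned an estimated probability $p_i$ and severity $s_i$, which are aggregated into a single figure,

$$R = \sum_{i} p_i \cdot s_i.$$

For a risk area such as negative effects on civic discourse, neither $p_i$ nor $s_i$ can be meaningfully estimated, let alone compared across platforms, which is why participants cautioned against over-reliance on such metrics.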
There was general consensus on the need for an ecosystemic approach that goes beyond listing specific risks. Some participants suggested that direct regulatory mandates might be a more appropriate way to mitigate known adverse impacts, while others argued that risk management could be a useful approach, provided it does not rely too heavily on quantification techniques that give a false impression of objective technical solutions.
[1] Article 40 DSA allows ‘vetted researchers’ to access data from VLOPs to conduct research on systemic risks in the EU. This provision aims to enhance independent monitoring of platform actions and contribute to the detection, identification and understanding of systemic risks.
[2] Article 37 DSA requires VLOPs to commission yearly independent audits assessing their compliance with the DSA's obligations, including their risk assessment and mitigation measures.