Decoding the DSA, one project at a time

Our Work

  • Webinar Recording: Promoting Safer Platform Design Through DSA Codes of Conduct

    Facilitated by Rachel Griffin

  • DSA Decoding in Action: DSA Ad Repository Data Sprint @ Sciences Po

    To ensure transparency, the Digital Services Act (DSA) requires Very Large Online Platforms (VLOPs) like Facebook, TikTok, and Google to establish a Public Advertisement Repository containing details of every paid ad (including text and images), information about the advertiser, and data on the ad's target audience.

    One year after the DSA's implementation, we aim to assess the effectiveness of these ad repositories through a data sprint, inviting collaboration among academics, regulators, journalists, and activists.

    This initiative represents a significant step towards scrutinizing and understanding the impact of online advertising practices and the role of transparency in mitigating associated risks.

  • Our Response to the EU Call for Feedback on the Draft Delegated Regulation on Data Access under the DSA

    The Delegated Regulation on data access provided for in the Digital Services Act aims to establish a framework for vetted researchers’ access to data from very large online platforms and very large online search engines, in order to increase their transparency and accountability.

  • What do we talk about when we talk about risk? Risk politics in the EU’s Digital Services Act

    by Rachel Griffin

    What are the implications of framing normative and political questions about platform governance in terms of ‘risks’ to be managed through technocratic expertise? This article suggests that the DSA’s system of risk management obligations for the largest platforms ignores the essentially political and contestable nature of risk, and will only reinforce the power of corporate and state actors to determine what kinds of harms will be recognised and addressed.

  • Systemic Risks in the DSA and its Enforcement

    by Beatriz Botero Arcila

    A brief overview of the enforcement actions taken to date, aimed at understanding how the Commission interprets the DSA’s systemic risks framework.

  • Takeaways from the webinar ‘Delimiting Systemic Risks in the DSA’

    A summary of the topics discussed during the webinar “Delimiting Systemic Risks in the DSA” with researchers, regulators, and civil society organisations.

  • From Generative AI to General Elections: The Risks and Realities of GAI

    by Francesca Elli

    This article analyses the DSA’s Electoral Guidelines in relation to the potential impact of generative artificial intelligence on electoral integrity.

  • An early win for the transparency measures of the DSA. A comment on Amazon Services v. European Commission (C-638/23)

    by Beatriz Botero Arcila

    This blog post reviews the European Court of Justice’s decision to reject Amazon’s plea to suspend compliance with Article 39 of the DSA and analyses the potential implications for the regulation’s future enforcement.

  • Social media platforms and challenges for democracy, rule of law and fundamental rights

    by Beatriz Botero Arcila and Rachel Griffin

    This study, commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the LIBE Committee, examines the risks that contemporary social media (focusing in particular on the most widely used platforms) present for democracy, the rule of law and fundamental rights. The study focuses on the governance of online content, assesses existing EU law and industry practices that address these risks, and evaluates potential opportunities and risks to fundamental rights and other democratic values.

  • Is it a Platform? Is it a Search Engine? It's ChatGPT! The European Liability Regime for Large Language Models

    by Beatriz Botero Arcila

    This Essay examines how to regulate LLMs so that their risks are mitigated while innovation is still encouraged and their benefits can be realized, focusing on the EU liability regime for the speech and informational harms and risks posed by LLMs. It proposes an interpretation of the newly enacted Digital Services Act under which the Act would apply to these tools, once they are released on the market, in a way similar to other intermediaries covered by content moderation laws, such as search engines.

  • Climate Breakdown as a Systemic Risk in the Digital Services Act

    by Rachel Griffin

    This policy brief offers a legal analysis of the DSA’s relevance to environmental policy and explains why environmental risks are within its scope. It then outlines appropriate measures to mitigate platforms’ direct and indirect environmental impacts. It concludes with recommendations for platform companies, regulators, and civil society on how to realise the DSA’s potential to help secure a more sustainable tech industry.

  • What Data Is Needed to Assess and Monitor Platforms’ Systemic Risks Related to Political Polarisation and Political Disinformation?

    by Beatriz Botero Arcila and Pedro Ramaciotti Morales

    A talk on the data needed to assess and monitor these systemic risks.

  • The Digital Services Act and the Climate Emergency: Risk Mitigation and Data Access for Research and for Responding to the Crisis

    by Rachel Griffin and Ilaria Buri

    A talk on the DSA and the climate emergency, covering risk mitigation and access to data for research and for responding to the crisis.

  • DSA Decoded: Key Questions and Answers

    by Francesca Elli and Emma Cabale

    The Digital Services Act (DSA), a groundbreaking regulation introduced by the European Union that became fully applicable in 2024, aims to foster a safer and more reliable online environment and establishes comprehensive rules for online intermediaries and platforms. As policymakers develop and implement the DSA, they are confronted with key questions. We have identified and answered some of them.