An early win for the transparency measures of the DSA.
A comment on Amazon Services v. European Commission (C-639/23)
by Beatriz Botero Arcila
Early in April, the European Court of Justice delivered an important decision regarding the implementation of the Digital Services Act (DSA) [1]. Affirming the DSA's objective of enhancing transparency and accountability for major platforms, the Court rejected Amazon's request for interim measures to suspend its compliance with Article 39 - the obligation to compile and publish information about its advertising services. This is an important decision because it suggests that the Court will give significant weight to the public interest pursued by the DSA in its enforcement and, in general, is inclined to uphold the DSA's data-sharing and transparency provisions, a cornerstone of the regulation.
At the same time, it is not a definitive decision, and there are still ways in which the Court could limit the transparency provisions in the future. The Court does not seem to believe that the EU legislature has an unlimited power to mandate the disclosure of confidential information to the public, and the Commission still must prove that those provisions are necessary to meet the public interest objectives pursued by the DSA.
In this blog post, I review the decision and comment on its importance for the enforcement of the DSA. To do so I proceed in three parts. The first offers some background on the DSA obligations at issue, their importance, and the procedure; the second summarizes the decision; and the final part discusses the Court's reasoning and offers some thoughts on the main proceeding ahead and what this may mean for the enforcement of the DSA. (Readers familiar with the DSA, or who care less about the legal details, should feel free to skim.)
Background
Amazon Services v. European Commission (C-639/23) is the final decision in an interim measures procedure in which Amazon had requested the suspension of its obligations to comply with Arts. 38 and 39 of the DSA. Articles 38 and 39 require VLOPs to introduce an opt-out option for recommender systems and to make available, in an advertisement repository, information about their advertising practices. Interim measures are a procedure parallel to a main proceeding in which a legal act is contested. In the interim measures procedure, applicants request the temporary suspension of the contested act because its operation threatens to harm them urgently and irreparably. In this case, the main proceeding is Amazon's challenge to its designation as a Very Large Online Platform. Amazon claimed that complying with Articles 38 and 39 would make it lose its competitive advantage over non-VLOP marketplaces and would make confidential information public. (Amazon, in fact, competes closely with Google and Meta as one of the largest online advertisers and dominates the retail advertising industry.)
The decision discussed here is the second instance of the procedure. In the first instance, the President of the General Court found that Amazon failed to prove that it would suffer an irreparable loss from complying with the obligation to offer an opt-out version of its personalized recommendation system. On the second charge, however, the President found that Article 39 required Amazon to publish information that was likely to be confidential, making it available to its competitors. The President reasoned that this was likely to cause Amazon serious and irreparable harm, and the request was granted.
That initial decision seemed to deal a significant blow to the enforcement of the DSA, particularly because enforcement is still in its early stages. These two obligations are central to the DSA's efforts to address risks associated with social media and platforms. Specifically, transparency obligations are central to ensuring an adequate level of accountability for VLOPs and VLSEs due to the importance and reach of these services. Until now, platforms have often raised privacy, trade secrecy, and security arguments to restrict access to information needed by regulators, researchers, and the public at large to understand how they work and what their effects on society are. The focus on tracking-based advertising seeks to shed light on a main driver of platforms' risks, because advertising works with personalized recommender systems to use data to personalize content and increase engagement. This optimization for engagement, however, may be related to phenomena such as the amplification of mis- and disinformation, political polarization, and kids' (and adults') decaying mental health.
Fortunately, the European Commission appealed, and the ECJ set aside the first instance decision to suspend the operation of Article 39 DSA.
The ECJ’s Decision
According to the ECJ’s case law and regulation, interim measures are granted only when three requirements are met: (1) the action in the main proceedings must appear to have reasonable substance (also known as the prima facie requirement); (2) the applicant must show that the measures are urgent and that without them it would suffer serious and irreparable harm; and (3) the balance of the parties’ interests and the public interest must favour granting them. I briefly summarize how the Court reviewed each of these in what follows:
First, and while examining the prima facie case, the Court found that it is indeed at least likely that Article 39 requires Amazon to publish information that is confidential for the reasons considered in the first instance. It recognized, however, that it is possible, too, that the information at issue is not confidential. Indeed, the Commission argued that much of the information requested is information that must be reported already under other regulations to which Amazon is bound [2], and that some of that information can be obtained in the market for advertising metrics and data analytics. Even if the substance of the issue is for the court of the main proceeding to decide, the Court found that Amazon’s claim is reasonable, and the prima facie case requirement was met.
Second, the Court also found that Amazon had successfully demonstrated that the alleged damage, should it occur, was irreparable and serious. Thus, the urgency requirement was met. The key argument here was that the potential damage of disclosing confidential information, assuming it is confidential, is serious and irreparable: The damage is serious because advertisers have “an interest in being able to implement advertising practices which cannot be easily reproduced by their competitors.” Third-party sellers could become reluctant to publish advertisements on Amazon Store because the information at issue would be publicly available and Amazon’s competitors would have access to knowledge of strategies that they could then implement to improve their competitive position.
The damage was also considered irreparable because Amazon was unlikely to be compensated for it via a subsequent action for damages, should the damage occur - and should Amazon succeed in the main proceeding. The Court noted that it was hard or impossible to estimate the damage, as the effects of publishing that information would depend on who acquired that knowledge - anyone, basically - and that it would be impossible to “assess the consequences that the publication of that information might have on Amazon’s commercial and financial interests.” Indeed, one of the hard things about information risks and damages in general is that once information is published it cannot be controlled: anyone can “see” it and use it. This is known as the “loss of control” problem.
Lastly came the balance of the private and the public interest at issue. The Court found that, despite all the above, the public interest pursued by the DSA had to prevail over Amazon’s material interests. The Court seemed to consider that Amazon’s potential harm, even if irreparable and urgent, was not especially sizable. The Court noted that the repositories required by Article 39 must be updated regularly. Thus, even if in practice the information published there would be deprived of its confidential nature, Amazon would still be able to update its advertising activities and attract advertisers, should it prevail in the main proceeding. Additionally, the harm would not jeopardize Amazon’s existence or long-term development, as only 7% of Amazon’s total revenue comes from its advertising activities.
This diminished assessment of the damage was then weighed against the public interest pursued by the DSA in general and by Article 39 more specifically. The Court explained that the DSA pursues important public interest objectives, mainly addressing the societal risks created by social media platforms and their business model. The obligation to subject VLOPs’ advertising practices to extra scrutiny and transparency measures is a central part of that effort.
Consequently, the Court found that, whereas dismissing Amazon’s request would have only a limited effect on the company, granting it would have an important impact on the implementation of the DSA. It would delay those objectives by potentially several years, would allow the risks of an online environment that threatens fundamental rights to develop further, and would create an unequal market for all the VLOPs and VLSEs that did publish their repositories, raising issues of unequal treatment.
Amazon’s request was dismissed.
Analysis
Amazon Services v. European Commission (C-639/23) touches upon a central element of the DSA’s ambition and effort: to subject the largest platforms to wider scrutiny and accountability by giving policy-makers, researchers, and the public at large better access to information about what they do and how they work. This is an early and decisive win for the enforcement of the DSA, and for the transparency and data-sharing provisions. It also seems to set the tone for how the Court may decide similar cases, including the main proceeding.
At the same time, legal cases aren’t over until they are over, and there is reason to be cautious about how broadly the Court may interpret the transparency obligations if Amazon, or any other platform, establishes that the information at issue is, indeed, confidential. To that end, I offer four brief points on what seem to have been the main drivers of the Court’s decision; on how the balance of trade secrecy and transparency may play out in future decisions, in the main proceeding, and, in general, in the interpretation of the DSA; and on what the worst-case scenario may be (it’s not so bad):
First, it’s important to highlight that in the decision the Court gave significant weight to the importance, and the urgency, of the implementation of the DSA: the Court defers to – and seems to agree with – the EU legislature’s assertion that the DSA is a central regulatory piece of the digital agenda that seeks not only to ensure the proper functioning of the internal market but, crucially, “to ensure a safe, predictable and trusted online environment in which the fundamental rights enshrined in the Charter are duly protected.” The Court also seems to believe that taking action to do so is urgent and that the delay that would come with not applying those obligations, potentially for a few years, is unacceptable given the threat that the current shape and development of the information environment represents to fundamental rights. (This is curious, from a legal perspective, because the Court itself says that the Commission neither claimed nor demonstrated that granting Amazon’s request for interim measures would impede the eventual achievement of the DSA’s objectives.) Thus, a first conclusion may well be that the Court sees the DSA as central, in general, to the guarantee of fundamental rights in the EU and that, preliminarily, it is inclined to weigh that public interest heavily against other interests.
Here, however, the small size and limited real effect of the potential damage tipped the balance unequivocally in favour of the Commission and the enforcement of the DSA (Amazon has, in fact, already published the repository).
Second, it is also important to recognize, as the Court did, that data openness comes with risks to individuals and organizations. Not without merit, it is a commonly held belief that transparency obligations in general are limited, or must at least be balanced, by private interests and rights such as data protection and trade secrets. (I have written about the balancing of data protection rights and transparency. Spoiler alert: data protection has been winning lately.) The case for trade secrecy interests, however, may be slightly different: Article 1(2)(b) of the EU Trade Secrets Directive (‘TSD’) establishes that the TSD shall not affect “the application of Union or national rules requiring trade secret holders to disclose, for reasons of public interest, information, including trade secrets, to the public or to administrative or judicial authorities for the performance of the duties of those authorities.” Mandating the disclosure of confidential information in the public interest is, thus, lawful under EU trade secrecy law.
Third, and despite the above, the Court does not seem to believe that the EU legislature has unlimited power to mandate the disclosure of confidential information. The Court didn’t go as far as to assert that the TSD will “take a step back” to give way to the transparency obligations of the DSA. The Court recognized that platforms have a legitimate and protected interest in their trade secrets, even if it wasn’t established that the information at issue was, or was not, confidential, and it recognized that Article 39 may eventually limit protected legal interests that platforms have in that information. (“(…) the judge hearing the application for interim measures cannot find that it has been established, with sufficient evidence (…) that the application of that Article 39 to Amazon would not result in a limitation of the rights that it may derive from Articles 7 and 16 of the Charter.” See num. 106 – recall that Art. 7 of the Charter refers to private life, and Art. 16 to the freedom to conduct business.)
Rights and interests, however, are also rarely absolute. According to Article 52(1) of the EU Charter of Fundamental Rights, they can be limited so long as the limitation is provided for by law, respects the essence of the rights and freedoms, and, subject to the principle of proportionality, is necessary and genuinely meets objectives of general interest. Here, the limitation is clearly established by law (the DSA) and is intended to meet objectives of general interest (platform transparency and accountability). But the Commission may still need to show that, subject to the principle of proportionality, a broad interpretation of the transparency obligations – one that allows for the publication of potentially confidential information, which will then cease to be confidential – is necessary to meet the objectives of the DSA. (“Such a limitation of those rights would, however, be such as to establish the illegality of Article 39 of Regulation 2022/2065 only if that limitation did not comply with the conditions set in Article 52(1) of the Charter.” See num. 107.) [3]
Fourth, there is most likely no need to despair. It seems unlikely that the Court is really considering declaring the illegality of Article 39 or any of the other transparency obligations. At most, the Court may be willing to limit who can access the information that platforms show is confidential – restricting, for example, the disclosure of confidential information to regulators (and potentially researchers and/or certain civil society organizations) but not to the public at large. (Showing that something is confidential information may still be challenging for platforms, as trade secret protections are hard to prove and are supposed to be construed narrowly.) Indeed, the Court somewhat cryptically notes that “the equivalent nature, for the application of Articles 7 and 16 of the Charter, of disclosure of information to the user concerned alone or to the public as a whole is a largely new and somewhat complex issue.” (See num. 102.) The Court may be suggesting that there is a difference between disclosing information to certain actors and making it publicly available, which may have repercussions in an eventual proportionality analysis.
In any case, there are more reasons to be optimistic, and, somewhat on purpose, I am trying to err on the side of caution. The Court signaled that it is committed to the objectives of the implementation of the DSA and that it sees the DSA as an important regulation for guaranteeing fundamental rights in the EU information environment. This is, in general, a major win. The early and interim win will, in the meantime, give the Commission the time and space to build its enforcement infrastructure and to show, but also learn, how the transparency measures support its objectives, what information really needs to be public, and what could or has to be shared in more limited ways. The fact is that there is so much we don’t know about social media and platforms, including what information we need to audit them. This decision grants the Commission (and civil society and the research community) some time to learn just that.
[1] The DSA established a new comprehensive regulatory framework for online platforms in the EU which reserves the most extensive obligations for ‘very large online platforms’ (VLOPs) and ‘very large search engines’ (VLSEs), defined as those with over 45 million monthly active users in the EU. Designated VLOPs must comply with a series of obligations, such as publishing regular transparency reports, conducting risk assessments, making some data available to researchers upon request and, most relevant for the case at issue, offering an opt-out option for recommender systems (Art. 38) and compiling and making available key information on their advertising practices (Art. 39).
[2] In particular, the Commission mentioned the Unfair Commercial Practices Directive (Directive 2005/29), the GDPR and Regulation 2019/1150 on fairness and transparency for business users of online intermediation services.
[3] To the best of my knowledge there is little recent case law that can elucidate how the Court may balance platform transparency with the private interests of platforms in confidential information. Recent case law balancing public transparency with private interests in confidential information focuses on the transparency obligations of European institutions, and even then, the case law is not conclusive. In PTC Therapeutics International v EMA, the Court ruled in favor of access to documentation where the European Medicines Agency had redacted some confidential information, but the plaintiff claimed that all the documentation should have been kept secret. In Breyer v REA, however, decided in 2023, a case regarding public access to documents concerning an EU-funded emotion recognition project for border use, the CJEU ruled that the harm to commercial interests outweighed the transparency interest in information about the early stages of the research project. A key difference between these cases and what is at issue under the DSA is that in both PTC Therapeutics and Breyer the information was in the hands of a European body, which is subject to stricter transparency requirements (see WM and Sovim).