The Digital Omnibus Won’t Work Without Enforcement Reform
Now that I’m emerging from a slower period of paternity leave, you can expect more regular programming. Today, I’m sharing an edited excerpt from my latest comments on the planned GDPR reform.
The European Commission’s Digital Omnibus proposal is, on its face, the most significant attempt to fix the GDPR since the regulation entered into force. It would clarify the definition of personal data and confirm that legitimate interest can support AI training (while failing to do anything about the cookie consent mess, alas). Today, ICLE submitted formal comments, in which I argue that the co-legislators should adopt most of the changes. If you are interested in my responses to the EDPB/EDPS Joint Opinion or the NOYB report, you will find them in the full comments.
Here, I want to focus on the argument that most commentators - and the Commission itself - are avoiding. The changes proposed in the Digital Omnibus are welcome, but they are far from sufficient. The same authorities that produced the current dysfunction will interpret the new rules. And unless the enforcement architecture changes, textual clarification will deliver only a fraction of the promised benefit. The Commission acknowledges that “more consistent and harmonised interpretation and enforcement across Member States” is needed. However, it provides no mechanism to achieve it.
Privacy myopia
Data protection authorities across the EU are, structurally, single-issue campaigners. This is not a criticism of the individuals who staff them - they are, for the most part, competent professionals who believe (correctly) that data protection matters. The problem lies in institutional design. DPAs have immense power to impose fines of hundreds of millions of euros. But they face no robustly enforced obligation to consider the consequences of their actions for interests other than privacy.
The result is what I have called “privacy myopia”: a regulatory culture in which maximising data protection is treated as the sole objective, and any consideration of competing interests - innovation, economic security, freedom of expression, public health - is treated as someone else’s problem. No countervailing institutional voice represents those interests within the enforcement process. Competition authorities occasionally intervene, but their mandate is narrower than the full range of what is at stake. Political authorities, whose responsibilities span economic security and public welfare, are excluded because their involvement would supposedly violate the independence requirement of Article 52 GDPR.
This has consequences: consider, for instance, the EDPB’s Opinion 28/2024 on AI models. It provides a lengthy list of compliance measures while offering no guarantee that following them will satisfy enforcement authorities. It grants regulators near-limitless discretion while providing minimal practical guidance to those trying to innovate responsibly. Businesses that invest in GDPR compliance for AI training have no assurance their efforts will be judged sufficient in a year or two. As I argued at the time, the uncertainty is as chilling as an outright prohibition - and for any entrepreneur aware of this risk (to be fair, many aren’t) and with a choice of jurisdiction, the rational response is to avoid the EU entirely.
The EDPB lacks accountability
In theory, the EDPB issues non-binding guidelines and opinions. In practice, these instruments function as the law in action: they shape DPA enforcement priorities, drive market behaviour, and are cited by national courts as authoritative. Controllers that deviate from EDPB guidance face the realistic prospect of enforcement action, regardless of whether the guidance accurately reflects the GDPR’s text or the CJEU’s case law.
The EDPB’s guidelines on Article 5(3) of the ePrivacy Directive illustrate the mechanism. The guidelines dramatically expanded what requires prior consent - treating generic URL parameters and standard browser transmissions as “access to information stored on the terminal equipment.” This reversed the narrower interpretation previously adopted by several national authorities (including the German Datenschutzkonferenz). The expansion occurred through guidelines, not legislation. It was not subject to impact assessment, parliamentary oversight, or robust stakeholder consultation of the kind required for legislative or even delegated acts (the consultation that took place served as a window-dressing exercise). The result was a reshaping of the regulatory environment for every website operator in the EU - through a nominally non-binding instrument that no one can effectively challenge in court. (I explored the consequences in “Why Europe Can’t Kill the Cookie Banner.”)
The EDPB/EDPS Joint Opinion 2/2026 on the Digital Omnibus reveals the same institutional logic. The EDPB argues that its own forthcoming guidance on pseudonymisation is a “better vehicle” than the Commission’s proposed Article 41a implementing acts. The pattern is consistent: wherever the Commission proposes implementing acts or legislative clarification, the EDPB argues it should have exclusive or primary control instead. In other words, the enforcer wants to keep setting the scope of the law it enforces.
Can the courts fix this?
Three recent sets of proceedings involving Meta and WhatsApp expose why the answer is: not yet, and not without help.
First, in WhatsApp Ireland v EDPB (Case C-97/23 P), the CJEU Grand Chamber held in February 2026 that EDPB binding decisions under Article 65 GDPR are “acts open to challenge” under Article 263 TFEU. This was a significant ruling. But the case was filed in 2021 and the admissibility ruling took five years. The merits remain pending. In this case, the EDPB had directed the Irish DPC to increase its proposed fine against WhatsApp from approximately EUR 30 million to EUR 225 million - a 7.5-fold increase. Even with access to the courts, meaningful review takes five or more years.
Second, in Meta v EDPB (Case T-319/24), the General Court held that the EDPB’s Opinion 8/2024 on “consent or pay” models “does not produce binding legal effects vis-à-vis third parties” and was therefore not an act open to challenge. The Court acknowledged the opinion used mandatory language (“should,” “should not”) but characterised this as “calling for an in-depth consideration” rather than imposing obligations. Meta’s appeal is pending. Even if Meta prevails, the ruling would address only Article 64(2) opinions - not the broader category of EDPB guidelines and recommendations that constitute the bulk of its quasi-legislative output. (I discussed the implications in “Meta v EDPB and What Really Needs to Change in the GDPR.”)
Third, in DPC v EDPB (Joined Cases T-578/21, T-579/21 and T-580/21), the General Court upheld the EDPB’s power to direct national authorities to broaden their investigations into areas the national authority had not originally examined - applying no meaningful scrutiny of whether the EDPB’s substantive positions were balanced.
The three cases together expose the accountability gap. The EDPB’s most influential outputs - guidelines, opinions, recommendations - are the instruments most shielded from judicial review. Its binding decisions are now challengeable, but only after years of litigation and without guidance on what standard of substantive scrutiny courts should apply. The courts have treated the EDPB as a neutral authority that considers the general public interest. Whether that characterisation is accurate is, at best, contestable.
What the Joint Opinion does not say
The EDPB/EDPS Joint Opinion on the Digital Omnibus runs to 47 pages. It contains no reflection on whether the enforcement architecture is adequate. It mentions DPA resource constraints only in the context of requesting more resources. It never acknowledges the structural problems that lead to inconsistent enforcement. It never discusses the EDPB’s own accountability or the quasi-legislative nature of its opinions and guidelines. It never engages with the argument that separating investigation from adjudication could improve the quality and legitimacy of enforcement decisions.
This silence is telling. A regulatory body invited to comment on a comprehensive reform of its own legal framework devotes no space to asking whether its own institutional design might contribute to the problems the reform is designed to solve. I am not suggesting the EDPB is acting in bad faith. The more plausible inference is that the institutional incentives run against self-examination. As Mark Leiser puts it, the Joint Opinion is “a regulatory counteroffensive” that “deploys the language of rights not primarily to protect individuals, but to reassert institutional centrality, supervisory reach, and interpretive authority.” The observation is sharp. But the incentive structure produces the behaviour, whether or not any individual regulator intends it.
Is proportionate enforcement possible?
The answer is yes - we have examples.
The CNIL has demonstrated that pragmatic enforcement is achievable within the existing framework. Its AI guidance acknowledges that GDPR compliance for AI training is possible and provides workable compliance pathways - a marked contrast with the EDPB’s strategically ambiguous Opinion 28/2024. But one national authority cannot shift EU-wide practice. The CNIL’s approach faces opposition from more absolutist privacy officials and activists, and there is no prospect that privacy enforcers will voluntarily adopt it across the EU. The structural incentives run the other way: DPAs that interpret the law broadly expand their own regulatory reach and receive institutional validation from the EDPB’s consistency mechanism. A DPA that interprets proportionately risks being overridden by the EDPB - as the Irish DPC’s experience demonstrates.
The UK provides an even more instructive example. The Information Commissioner’s Office announced in July 2025 a risk-based enforcement approach for online advertising, identifying six advertising capability categories where low-risk activities could proceed without consent. The ICO is doing something EU DPAs seem structurally incapable of (at least when acting as the EDPB): exercising enforcement discretion to achieve proportionate outcomes, distinguishing between extensive behavioural profiling (which requires consent) and low-risk activities like frequency capping and ad delivery (which may not).
Moreover, this is not just a single enforcement initiative. Section 91 of the Data (Use and Access) Act 2025 rewrites the Information Commissioner’s statutory duties. It inserts a principal objective requiring the Commissioner to secure an appropriate level of protection for personal data “having regard to the interests of data subjects, controllers and others and matters of general public interest.” It imposes explicit duties to consider innovation, competition, crime prevention, and public security. Section 120C requires a strategy framed accordingly, and section 120D requires consultation with other regulators, backed by annual reporting to Parliament.
This shows that data-protection enforcement need not be structured as a one-dimensional privacy mandate. A legislature can preserve regulatory independence while imposing explicit duties of balance, strategy, consultation, and public accountability.
What should the EU do?
I have proposed a five-part structural reform. It does not question the importance of data protection. It targets the institutional design that converts competent professionals into single-issue enforcers.
First, separate investigation from adjudication. DPAs would retain their investigative powers - the resources, technical expertise, and local knowledge they have built over years. But for consequential cross-border cases, they would present their findings and enforcement recommendations to an independent decision-making body, rather than serving as both prosecutor and judge.
Second, create an independent multidisciplinary EU tribunal. The tribunal would issue binding decisions, including fines, for consequential cross-border enforcement actions. Critically, it would not be composed exclusively of data protection specialists. A diverse panel - including economists, generalist judges, and experts in the regulated sectors - would weigh data protection against other fundamental rights and Europe’s broader public-interest objectives.
Third, mandate formal balancing of all interests. The tribunal’s legal framework would require it to articulate, explicitly, in each decision, how it balances data protection against other fundamental rights and the economic realities of the regulated activity. The GDPR already requires this balancing (Article 6(1)(f), Recital 4). The current enforcement architecture provides no institutional mechanism to ensure it occurs.
Fourth, appoint advocates general for non-data-protection interests. Drawing on the CJEU model, the tribunal would include judicial officers specifically tasked with highlighting and defending non-data-protection interests - innovation, freedom of expression, economic security, public health - that are affected by enforcement decisions but have no institutional voice under the current system.
Fifth, require tribunal review of EDPB outputs. The tribunal would review EDPB guidelines, opinions, and recommendations before they become final. If the EDPB produces guidance that fails the proportionality standard - as in its Article 5(3) ePrivacy guidelines, which effectively require website operators to obtain consent from would-be fraudsters before deploying anti-fraud measures - the tribunal could refuse to approve it.
This does not require Treaty change. Article 257 TFEU already empowers the European Parliament and Council to establish “specialised courts attached to the General Court.” Garicano, Holmström, and Petit’s Constitution of Innovation proposes using exactly this Treaty basis to create specialised commercial courts for internal market enforcement - regional tribunals with fast-track procedures, English-language proceedings, and EU-wide injunctive powers. Their courts are designed for mutual-recognition and trade-barrier disputes, but the institutional architecture is directly transferable. A specialised EU regulatory tribunal under Article 257 TFEU could house both the commercial jurisdiction they envision and the data protection enforcement function I propose. It would also embed data protection enforcement within a broader framework of EU regulatory adjudication, ensuring that privacy is weighed alongside the other interests that the EU’s regulatory instruments affect.
The Unified Patent Court already provides a working precedent for specialised EU judicial bodies in complex technical and economic matters. The real issue is whether the political will exists to apply the same institutional logic to data protection - where the current enforcement architecture serves the institutional interests of the bodies that would need to cede authority.
Who else sees the problem?
I am not alone. Luca Bolognini and Enrico Capparelli propose that supervisory authorities should be required to carry out a prior analytical assessment of the effect that their decisions would have on innovation, competitiveness, and other fundamental rights - and that this assessment should be notified to the addressee with the provisional decision. They also propose mandatory participatory prior-consultation procedures for EDPB guidelines, with public availability of consultation results.
Peter Craddock observes that DPA opposition to the entity-relative definition of personal data cannot be treated as neutral, because when information ceases to be perceived as personal data, it falls outside the authorities’ remit: “one must be careful not to see their position as neutral.”
What happens if nothing changes?
If the enforcement architecture remains unchanged, the substantive reforms in the Digital Omnibus will be interpreted by the same authorities, using the same maximalist lens, that produced the problems the reforms are designed to solve. The entity-relative definition of personal data will be narrowed through guidance and enforcement practice. The legitimate interest framework for AI training will be hedged with conditions that recreate the uncertainty it was meant to resolve. The ePrivacy exemptions will be read as narrowly as the exemptions they replace.
The Commission should know this. Its own Explanatory Memorandum acknowledges the enforcement consistency problem. But the Digital Omnibus contains no mechanism to address it. The co-legislators have an opportunity - arguably their best opportunity in a decade - to fix not just the text but the institutions that apply it. The question is whether they will take it, or whether they will settle for a reform that looks good on paper and changes less than it promises.
