—Katarzyna Łakomiec, Humboldt Research Fellow at the Max Planck Institute for Comparative Public Law and International Law, Assistant Professor at the Institute of Law Studies of the Polish Academy of Sciences; Mateusz Grochowski, Associate Professor at Tulane Law School, Affiliated Fellow at Yale Information Society Project


At what point do Orwellian metaphors become unproductive? One might ask this when observing recent discussions on the implementation of the EU Digital Services Act (DSA) in Poland.
Enacted in 2022, the DSA is the European Union’s principal legislation regulating online platforms. It addresses the broader societal risks created by platform design, algorithms, and content moderation systems, including threats to fundamental rights and consumer interests. The DSA’s enforcement system is a networked model in which Digital Services Coordinators (DSCs) are designated at the national level, while the European Commission plays a central role in supervising very large online platforms (VLOPs) and very large online search engines (VLOSEs). The system relies on information-sharing, mutual assistance, and coordinated action across Member States.
Is entrusting the protection of citizens’ rights to private actors truly preferable to administrative oversight? The question arises naturally from the recent debates over the DSA in Poland, and it is worth reflecting on. Poland’s implementation struggle – remarkably unusual compared with other EU Member States – is not a purely domestic matter. As we argue, it also raises broader, more troubling questions for the protection system the DSA aims to create.
After three long years, in December 2025, the Polish Parliament adopted a law implementing the DSA. Any sense of relief (measured, as is usual where the balancing of values is at stake) was, however, short-lived. The President promptly vetoed the bill, invoking Orwellian visions and parading them before the Polish public as genuine threats to freedom of expression.
1. Orwellian Fears and Digital Realities
The implementation of the DSA in Poland has been a bumpy road from the very beginning. Under the Law and Justice government (until late 2023), no decisive steps were taken, while the Civic Coalition government revised the draft several times, navigating long-standing sensitivities surrounding censorship. Finally, after a lengthy drafting process involving civil society actors, a compromise version was developed, incorporating a number of safeguards against administrative overreach. In the statement accompanying his veto, the President made repeated references to the Orwellian Ministry of Truth (Minitrue), while also voicing concern that oversight of the internet would rest with a DSC, an administrative body, rather than an independent court. Beyond this, the President’s position is problematic in another – much more substantive – respect, as it overlooks a key element of the analysis. Although he stresses freedom of expression and the constitutional requirement to use the least restrictive means under the “necessity test,” he ignores that the alternatives being compared must offer a comparable level of effectiveness. The objections to the bill therefore appear one-sided and overlook the central question: how to ensure that digital market regulation effectively advances the DSA’s objectives while preserving the essence of fundamental rights, especially freedom of expression.
The President argues that courts are better suited than administrative authorities to protect freedom of expression, warning that empowering executive bodies risks politicising decisions on lawful content. Because judicial review does not, in his view, sufficiently mitigate this risk, determinations of the lawfulness of a platform’s moderation of speech should remain with independent courts rather than administrative, order-based structures. Yet any meaningful comparison must also acknowledge that judicial protection – especially when it requires full evidentiary review and procedural guarantees – may be far slower than decisions by specialised administrative authorities supported by expert resources and a European supervisory network.
2. Speed vs. Legitimacy Conundrum
Beyond its immediate outcome (namely, the further postponement of the effective implementation of the DSA), the Polish DSA debate exposes a broader tension between speed and legitimacy in protecting rights online. The appropriate balance between these values is not fixed; it depends on the type of content involved, its significance for the author, the harm it may cause to third parties, and the manner of its dissemination. What the President’s reasoning behind the veto fails to acknowledge, however, is that even brief circulation of certain content (revenge porn being a paradigmatic example) can inflict lasting, dignity-impairing harm on the person concerned. At the same time, the President seems to place considerable trust in large technology companies to self-police. In his view, administrative oversight is unnecessary because the DSA already imposes sufficient obligations through its notice-and-action mechanism.
Yet a model built solely on self-regulatory enforcement raises serious legitimacy concerns. The only authority platforms can invoke derives from adhesion contracts (based on standardized terms of service) with users. Nor is this authority persuasive from the standpoint of sovereignty and constitutional values: it effectively outsources the interpretation of European and national fundamental rights to private actors that were neither designed nor institutionally equipped to perform that evaluative role.
Content moderation systems do not promise to get every individual speech decision right; rather, they are designed to increase the probability that most decisions will be correct most of the time, and that, when errors occur, they will tend to be corrected in accordance with pre-defined values and goals. This systemic logic persists even though each moderation act is framed as an individual determination concerning fundamental rights – often the rights of at least two persons: the speaker and the person exposed to or harmed by the content. In practice, the overall orientation of the system is shaped at the European level by the European Commission, but it must also be articulated and operationalised by DSCs at the national level.
3. Europe’s Romance with Administrative Constitutionalism
Internet regulation has moved far beyond command-and-control. The EU and its Member States increasingly outsource governance to private actors through co- and meta-regulation – on the premise that public authorities lack both the technical expertise and the resources to keep pace with digital markets. Platforms move faster and know their systems best.
But private delegation raises democratic legitimacy problems analogous to those discussed earlier. Platforms shape the scope of fundamental rights with little institutional accountability. At the same time, public oversight increasingly relies on a matrix of supervisory authorities operating across different regulatory regimes (including data protection, platform regulation, and consumer protection), of which DSCs form a key component. Across this matrix, the mandate of supervisory authorities is shifting from a narrow focus on compliance with detailed legal requirements to a broader role encompassing the protection of fundamental rights and other European values (even if not expressed through more specific legal rules).
Their legitimacy can be reinforced through independence guarantees, inter-agency cooperation, and judicial review. In classical theory, agencies are faithful agents of the legislature – nothing more. EU law complicates this: supervisory authorities emerge from overlapping Union and national processes and operate within regulatory networks marked by mutual influence. Yet Member States still retain meaningful procedural autonomy in designing their DSCs – a lever for which they have no equivalent vis-à-vis platforms.
Inaction (i.e. the lack of an implementing statute) is not a neutral choice. It either strips individuals of effective remedies or pushes other authorities – such as Poland’s Personal Data Protection Office, improvising on deepfakes – to stretch their mandates, where they could instead have cooperated with the DSC. Either way, the legitimacy of platform regulation suffers.
The stakes go further. The DSA’s supervisory matrix depends on coordination: a missing or dysfunctional coordinator does not just create a local gap – it weakens the Commission’s ability to supervise VLOPs and VLOSEs across the entire system. Holes in the network unravel the whole.
4. The Missing Institutional Link
As things stand, Poland has no legal act implementing the DSA – and this situation is likely to persist for some time. Poland remains the only EU Member State yet to designate a DSC, and one of the very few without a functioning domestic enforcement framework.
What does this mean in practice? As in other areas of administrative law, competences cannot simply be presumed. No existing authority can automatically step into the DSC’s role without an express statutory basis. Until implementing legislation enters into force, the DSA cannot be effectively enforced in Poland: there is, quite simply, no authority empowered to perform its functions. While the Polish Ombudsman can address violations of fundamental rights, its competences vis-à-vis private actors are limited in scope.
Ideally, the DSC’s competences would be entrusted to one or several existing administrative authorities, building on their regulatory experience, while at the same time supporting their transition into constitutional actors dealing with matters of freedom of expression and the protection of individual rights and societal interests in the digital sphere.
This gap can be mitigated only to a limited extent through an expansive interpretation of provisions already in force under Polish law – particularly in the fields of consumer protection and personal data. One might imagine, for example, that Polish rules implementing the Unfair Commercial Practices Directive could be construed to more fully reflect the DSA’s understanding of dark patterns, or that general free-expression protections could be read in light of how the DSA frames freedom of expression. Such an approach would sit comfortably both with the DSA’s nature as an EU regulation – automatically part of the domestic legal order without further transposition – and with the broader principle of systemic, EU-friendly interpretation of national law.
Admittedly, however, these interpretive manoeuvres can only go so far. They may produce modest course corrections at the margins, but they cannot deliver the structural transformation of the digital market’s governing logic that the DSA was designed to achieve.
In January 2026, the Polish government published a new bill implementing the DSA that establishes a national procedure for issuing orders to disable access to illegal online content – particularly content linked to serious criminal offences – while introducing procedural safeguards, including the right to judicial review, the absence of immediate enforceability, and mechanisms to restore access where content has been wrongly blocked. The future of this law, however, remains to be seen.
5. From National Delay to European Dysfunction
As a result, the DSA remains, for now, almost entirely suspended in Poland – formally binding yet institutionally disarmed. The DSA was designed around a network of DSCs bound together by shared obligations of mutual assistance, cross-border enforcement, and joint risk monitoring. This architecture presupposes that every node in the network is present and operational. Poland’s absence does not simply leave a blank space – it disrupts the cooperative logic on which the entire system depends. Without a functioning DSC, Poland cannot participate in coordinated response mechanisms, contribute to collective digital market oversight, or serve as a meaningful partner for the Commission.
Though it might be tempting to frame this legislative standstill merely as a pause, this framing is mistaken. Inaction redistributes costs invisibly. It either strips individuals of effective remedies or pushes other authorities into improvisation beyond their statutory mandates. The Polish data protection authority’s engagement with deepfake cases is a symptom of this dynamic – a gap filled not by design, but by necessity, and at the cost of institutional coherence. The gaps in the architecture do not stay local: they propagate upward, weakening risk monitoring across the digital services market, disrupting cooperation mechanisms the system was built around, and ultimately reducing the Commission’s capacity to supervise VLOPs and VLOSEs – thereby undermining the EU’s broader ambition to govern the digital space on its own constitutional terms.
Suggested citation: Katarzyna Łakomiec and Mateusz Grochowski, Too Much Time on Minitrue: Implementing the Digital Services Act in Poland, Int’l J. Const. L. Blog, May 7, 2026, http://www.iconnectblog.com/too-much-time-on-minitrue-implementing-the-digital-services-act-in-poland/