The European Commission now has direct supervisory powers over major online platforms, and the goal of the EU’s new digital legislation is not to impose fines but to ensure compliance with European rules, Bulgarian MEP Eva Maydell of the European People’s Party (GERB-UDF), co-rapporteur of the Artificial Intelligence Act (AI Act), told BTA in an interview.

The interview was conducted as part of an initiative supported by the European Parliament and dedicated to the implementation of European legislation in Bulgaria. In the interview, Maydell commented on the European Commission’s readiness to exercise effective oversight over major technology companies, the role of national institutions, restrictions on targeted and personalized advertising, and the place of the new regulations in the broader debate on the EU’s technological sovereignty.

Q: Ms Maydell, to what extent is the European Commission ready to exercise effective oversight over giants such as Meta, Google and TikTok, and how will Bulgaria take part in this process?

A: The European Commission is no longer merely an observer. With the entry into force of the Digital Services Act (DSA), the Digital Markets Act (DMA) and the Artificial Intelligence Act (AI Act), it now has direct powers over companies such as Meta, Google, TikTok and Apple. For me, however, it is extremely important that procedures for imposing fines are not perceived as “money collection” from major technology companies. In Europe, we do not use fines as a form of tariffs or taxes – they are a tool to encourage platforms to comply with our rules.

As far as Bulgaria is concerned, national institutions such as the Communications Regulation Commission, the Council for Electronic Media and the Commission for the Protection of Competition have a role in gathering information and alerting the European Commission to alleged violations. For its part, the Commission has sufficient instruments to carry out inspections and engage in various procedures with the platforms.

Q: The Digital Services Act bans targeted advertising to children and the use of sensitive personal data. Do you expect these restrictions to change the platforms’ business models? What safeguards exist against political pressure through the application of the Digital Services Act?

A: This is nothing new for Europe. European citizens expect Member States and the European Union to protect their rights and provide them with security. If a platform wants to operate on a market of over 500 million users, it must meet these expectations.

In the Bulgarian context, we can recall the beginning of the Russian invasion of Ukraine, when there were cases of content critical of Russia being removed, which triggered significant public discontent. My team and I held a number of meetings with representatives of Meta, and as a result the company organized events in Sofia to explain how its content moderation mechanisms function.

The reality is that market scale matters for large platforms. The Bulgarian market is relatively small, which is why it is not subject to a special individual approach but rather to the common European model. This is precisely the guarantee that neither current nor future governments or institutions will be able to exert effective political pressure on online platforms.

Q: Can the Digital Services Act be seen as part of the EU’s broader strategy for technological sovereignty, alongside the Digital Markets Act, the Artificial Intelligence Act and data regulations?

A: I am not among those who believe that technological sovereignty can be achieved solely through legislation and fines. In today’s world, especially in the field of technology, it is difficult to speak of full national or even European sovereignty. In the European Parliament, we regularly discuss whether Europe can independently produce all the semiconductors it needs – for now, the answer is no. The same applies to creating artificial intelligence models on the scale of ChatGPT.

What is more important, however, is that this is not necessarily a problem. Europe must find its own model of technological development, which does not have to follow the American one. It is crucial that traditional European industries do not miss this technological wave and that they modernize.

It is a mistake to think that the aim of this legislation is to limit the presence of major American technology companies in Europe. There is no way to make people stop using platforms such as Facebook, nor is that our goal. Our aim is for these platforms – Facebook, X and the others – to comply with European rules.

A few weeks ago, French President Emmanuel Macron raised the question of why we allow external companies to exert such strong influence over our children. For me, it does not matter whether these companies are American or not. The fact that children spend hours on these platforms means that we must understand how their algorithms work.

I often give an extreme but illustrative example. There are countries where technology is surprisingly well developed, and one of them is Iran. There, artificial intelligence is used to track whether women comply with Sharia requirements for covering their hair and faces, including while driving. This leads to the imposition of daily penalties – from fines to imprisonment. In the European Union, the use of artificial intelligence for facial recognition is strictly limited, precisely because personal freedoms and rights are a core part of European identity. This understanding sets us apart not only from Iran and China, but to some extent also from our partners in the United States.