On 19 November, the European Commission is set to propose changes to the European Union’s data protection and artificial intelligence regulatory frameworks in a so-called Digital Omnibus – a term adopted to mean the streamlining and simplification of rules. The Omnibus is part of a broader strategy to reduce regulatory burden and increase competitiveness within the European economy.
For the EU digital sector, the proposal could be bad news. As with previous Omnibus proposals, evidential underpinning seems to be absent, with the Commission claiming that the proposed changes are merely of a “technical nature” (the legality of such a claim is contested). It is therefore unclear whether the proposal will support growth while maintaining standards of protection.
The Digital Omnibus will seek to boost AI market performance by relaxing controls on data use, giving AI companies a freer hand. It could broaden companies’ access to data, a vital issue for AI development. Meanwhile, providers of potentially high-risk AI applications might no longer be required to register their systems in an EU public database if they self-assess them as not high-risk, making it harder to validate those assessments.
The reform could expand the definition of ‘non-personal’ data, putting it beyond the scope of personal data protection. This might be done for pseudonymised data, from which the original data subject supposedly cannot be reidentified by the data processor – though another processor might be able to do so.
The Commission might also propose to allow the use of sensitive information for the development and operation of AI systems, provided that such information is not disclosed directly. However, sensitive attributes can often be inferred indirectly, and with considerable accuracy. Gender, for example, can be inferred from salary or browsing history, while algorithms have been found to identify sexual orientation accurately from facial images.
Companies might also be allowed to extract data directly from users’ devices if certain conditions are met (eg a smartphone manufacturer could maintain that it needs to harvest data from users for security purposes). The Omnibus could also make it more difficult for data subjects to exercise their rights to access their data, to have it ported or to object to processing, if the data controller can demonstrate that the purpose of the subject’s request goes beyond the protection of personal data. This could, for example, undermine workers’ access to their data in case of a legal dispute with an employer.
The risks associated with these ideas are substantial. If they materialise in the Commission’s proposals, they should be rejected for the absolute harm they could cause. Nor is it clear that such proposals would offer a straightforward economic return. Data fairness is undoubtedly costly. For example, removing gender bias from AI prediction models in e-commerce leads to an 8%–10% increase in costs: if companies cannot use gender to predict consumer behaviour, they lose a tool to increase profits. Yet privacy protection increases consumer trust, leading to an expansion and stabilisation of market demand, with possible positive effects on innovation.
Furthermore, competitiveness is a relative concept: it depends on enhancing Europe’s economic performance compared to global rivals. The main beneficiaries of the Omnibus, however, might be large US tech firms, which already exercise a strong hold over European users. Even if the absolute performance of European companies improves, the gap with the US in AI may widen.
The Omnibus seems likely to include some common-sense amendments: establishing a single point of entry for companies to report cybersecurity incidents or data breaches, for example, and extending certain regulatory privileges to small and medium-sized companies, including simplified technical documentation or lower fines. However, the root causes of the relatively poor performance of European AI markets are deep. EU regulation is unlikely even to be the main factor. Companies see uncertainty, lack of skills, problems accessing finance and national fragmentation as more significant barriers to investment. Simply reducing company reporting obligations will not give the EU a fresh start.
The Commission could take a different approach. EU privacy laws certainly need reform, but of a structural kind. For example, the use of the notion of ‘consent’ to data processing may be outdated, and could be superseded by the establishment of an enforcement system that guarantees protection to users all the time, regardless of their consent preferences (one reported Digital Omnibus proposal would require automated privacy signals from browsers to overcome cookie consent fatigue; this would be a positive step).
Structural reform, however, cannot be rushed. It requires a genuinely participatory process, supported by reliable evidence, in line with the Commission’s guidelines on better regulation. The Digital Omnibus, unfortunately, looks unlikely to deliver this.
A Bruegel paper by the author on efficiency and distribution in the EU digital deregulatory strategy is forthcoming.