From corporations to governments, harnessing the value of data is a critical issue in public policy development. There is a perception that pre-existing tools are inadequate to tackle certain issues, particularly in emerging technologies.

Diverse approaches to data regulation are being adopted globally, from free market self-regulation to stringent and prescriptive ex ante legislation. The picture is made all the more complex by the different business models across digital platforms. This chapter provides an introductory overview of the global data regulatory initiatives – beyond just privacy and data protection – in Europe, the Americas and Asia.

Europe

One of the key strategic objectives of the European Commission (EC) in its Strategic Plan 2020–2024 was to create a ‘Europe fit for a digital age’. In 2020, it launched the Shaping Europe’s Digital Future (SEDF) strategy, outlining its vision for the European Union to embrace an increasingly digital world. One of its three strategic aims is to foster a fair and competitive digital economy. Since 2020, the European Union has initiated an extensive legislative agenda directed at the digital economy. Key policy areas of the SEDF include digital services, artificial intelligence (AI), cybersecurity and data sharing.

The SEDF introduced a swathe of new laws that sit alongside existing regulations such as the European Union’s data protection law, Regulation (EU) 2016/679 (the General Data Protection Regulation (GDPR)), which became applicable in May 2018. The interaction between the GDPR and the new laws created as part of the SEDF is nuanced and complex. In theory, the new laws rely on the GDPR for protections that apply to personal data. In practice, however, fundamental concepts can cause tension, such as the relationship between the consent requirements under Regulation (EU) 2022/1925 (the Digital Markets Act (DMA)) and consent as a legal basis for processing personal data. Effective enforcement of the GDPR is also a prominent topic, especially as personal data becomes increasingly relevant to other areas of law, such as competition. Introducing many new regulatory mandates, and expanding the scope of some existing ones, will create challenges of competency and add complexity to enforcing the GDPR harmoniously. The GDPR is a crucial thread in the web of data regulation that is emerging and its relationship with the laws under the SEDF is a running theme.

Digital services package

The digital services package proposed two laws to regulate the digital space: the DMA and Regulation (EU) 2022/2065 (the Digital Services Act (DSA)).

The aim of the DMA is to enhance the fairness and contestability of EU digital markets. The Act sets out ex ante requirements on gatekeepers. The EC designated the first set of gatekeepers in September 2023 (namely, Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft) in relation to specific core platform services (CPSs). Booking.com was designated in 2024.

The obligations and prohibitions set out by the DMA address interoperability, data combination, data access by business users or rivals, use of platform data, transparency in advertising and self-preferencing. Data-specific rules include requiring gatekeepers to give business users access to data on their use of relevant CPSs, including data on end user engagement on that platform, and to refrain from using non-public data of business users, collected by a CPS, to compete against business users. Gatekeepers must also obtain end users’ consent to:

- process, for advertising purposes, personal data of end users using third parties’ services that make use of the gatekeepers’ CPSs;
- combine personal data from a CPS with personal data from:
  - any further CPS or other services offered by the same gatekeeper; or
  - third-party services;
- cross-use personal data from a CPS with other services offered by the gatekeeper; or
- sign in end users to multiple services offered by the gatekeeper to combine personal data.

The DMA also imposes obligations to enable ‘data portability’ – gatekeepers must provide ‘effective portability’ of data provided or generated by the user when using gatekeepers’ CPSs, including through providing continuous and real-time access to the data.

We expect 2025 to be a year of active enforcement of the material provisions of the DMA. The EC spent 2024 both on designating additional companies as gatekeepers (Booking.com) and initiating the first non-compliance investigations of Alphabet, Apple and Meta, and proceedings to specify how Apple is to implement certain obligations. On 23 April 2025, the EC found that Apple breached its anti-steering obligation under the DMA and that Meta breached the DMA obligation to give consumers the choice of a service that uses less of their personal data. The EC fined Apple €500 million and Meta €200 million. The EC also closed its investigation of Apple’s user choice obligations and found that Meta’s online intermediation service Facebook Marketplace should no longer be designated under the DMA, as it no longer meets the DMA threshold. There are no indications that additional gatekeepers will be designated any time soon, if only for a lack of EC staffers. According to public reporting, the EC is considering opening infringement proceedings against Amazon regarding self-preferencing allegations, which will be for the new Competition Commissioner, Teresa Ribera, to decide.

One unknown is how the DMA will interact with antitrust enforcement (which continues to apply in parallel), with national (non-competition) regulation (which can go beyond the DMA) and with the GDPR. In theory, data protection regulators, the EC and national antitrust agencies could probe data processing activities under the GDPR, the DMA or antitrust rules, respectively. In relation to the interplay between the DMA and the GDPR, one noteworthy development is that the EC offices in charge of the enforcement of the DMA and the European Data Protection Board (EDPB) announced that they intend to provide guidance to ensure coherent application of the two regulations. The EDPB’s involvement signals that forthcoming guidance on the DMA may apply stringent GDPR principles, potentially increasing scrutiny for businesses and affecting their compliance strategies.

The DMA is narrow in scope and oriented towards addressing the perceived failures of competition tools in the digital space, whereas the DSA is broader and primarily focuses on content moderation and protecting consumers online. The DSA builds on Directive 2000/31/EC (the eCommerce Directive) and applies to intermediary services, including online platforms and search engines operating in the European Union. Requirements are cumulative, according to the nature and size of the company, with the most onerous obligations imposed on the online platforms and search engines that have more than 45 million users in the European Union. Broadly, requirements for online platforms and search engines focus on increased user transparency, particularly in relation to advertising and content removal through notice and takedown, and on the use of specific categories of data, including children’s data. The new EC for 2025–2029 has indicated that it will strengthen certain consumer protection requirements in a new Digital Fairness Act, which will target particular concerns, such as dark patterns and addictive design.

Artificial intelligence

Regulation (EU) 2024/1689 (the Artificial Intelligence Act (AI Act)) is the first legislation of its kind to create cross-sector, legally binding rules on the development and deployment of AI. Inspired by the existing EU product safety framework, the AI Act adopts a horizontal and largely risk-based approach. Its aim is to increase trust in AI and promote innovation while protecting consumers and safeguarding European values and fundamental rights.

The AI Act primarily applies to providers and deployers of AI systems and general purpose AI models in the European Union, or whose output is used in the European Union. Some obligations apply to other parties, such as importers. As of 2 February 2025, applications of AI that pose an unacceptable risk are prohibited in the European Union. Most obligations fall on high-risk AI (e.g., risk management systems, human oversight and data quality requirements). Transparency obligations will apply to AI systems that pose specific transparency risks (e.g., deep fakes). Requirements for general purpose AI models are tiered according to perceived risk.

AI continues to be a focus area for antitrust regulators. In May 2024, the EC High-Level Group for the DMA issued a public statement on AI and the DMA that highlighted key concerns, such as access to data (including training, testing and validation data). To address these challenges, the High-Level Group committed to exchange enforcement experience and regulatory expertise as relevant to the implementation and enforcement of the DMA, as well as with regard to AI.

Several EU Member States have also been actively pushing for the inclusion of AI within the scope of the DMA. On 10 and 11 February 2025, at an AI conference in Paris, representatives of Germany and the Netherlands advocated for modifying the DMA to cover certain AI services or obligations. A joint draft paper by France, Germany and the Netherlands, summarised by public reporting, proposes broadening the scope of the DMA beyond AI-powered functionalities integrated within existing core platform services (which are already in scope). It urges the EC to investigate designating certain cloud service providers under the qualitative thresholds of the DMA, given the importance of computing power for large AI models.

More generally, in July 2024, the EC, the United Kingdom’s Competition and Markets Authority (CMA) and the United States’ (US) Department of Justice (DOJ) and Federal Trade Commission (FTC) released a joint statement highlighting competition risks in generative AI and other markets. These perceived risks included (1) concentrated control of key inputs (including data) in the hands of a small number of companies, (2) entrenching or extending market power in AI-related markets and (3) arrangements between key players. Several agencies have published reports highlighting areas for further scrutiny, particularly with regard to data access and use.

Authorities such as the EC, the German Federal Cartel Office (Bundeskartellamt, BKartA) and the United Kingdom’s CMA are also actively scrutinising investments in AI companies, the hiring of employees from AI companies and partnerships with AI companies – at least, they are attempting to. Many of the partnerships do not meet jurisdictional thresholds (as conventionally interpreted) and regulators have been grappling with whether these transactions are reviewable, seemingly on a case-by-case basis.

Cybersecurity strategy

The European Union’s cybersecurity strategy is particularly broad and includes three key pieces of legislation. The aim is to increase the resilience and technological sovereignty of the European Union, while paving the way for Europe to be a cybersecurity leader.

Directive (EU) 2022/2555 (the Network and Information Systems 2 Directive (NIS2)) replaced Directive (EU) 2016/1148 (the NIS Directive) of 2016, which was the first of its kind in the European Union. Rapid evolution of the landscape and fragmented implementation of the NIS Directive led to an update. The focus of NIS2 remains on critical infrastructure, but more industries are caught. As NIS2 is a directive, Member States were required to transpose it into national law by 17 October 2024. Each Member State was also required to develop a list of essential (e.g., cloud services) and important (e.g., social networking platforms) entities within scope by 17 April 2025. To date, however, many Member States have not met these deadlines.

Requirements focus on building cybersecurity risk management frameworks and reporting incidents in each Member State.

Regulation (EU) 2024/2847 (the Cyber Resilience Act (CRA)) sets out new rules for software and hardware products and their remote data processing solutions. Its central aim is to improve EU resilience by setting mandatory minimum security standards for products offered in the European Union. Requirements apply to manufacturers, distributors and importers of products with digital elements. Products that are explicitly identified as ‘critical’ will be subject to additional requirements. In general, the obligations relate to cybersecurity risk assessments, conformity assessments and documentation as well as incident reporting. The first requirements will start to apply in September 2026.

Finally, Regulation (EU) 2022/2554 (the Digital Operational Resilience Act (DORA)) targets the financial services sector. Building on NIS2, DORA creates pan-EU security risk management requirements for companies in the financial services sector and their critical information and communications technology (ICT) service providers. DORA applies directly to regulated financial entities and their ICT service providers that have been designated as ‘critical’ by EU authorities. It also requires financial entities to include specific contractual terms with all their ICT service providers, creating an indirect obligation for all organisations that offer ICT services to regulated financial entities. Requirements involve ICT and third-party risk management, incident reporting, resilience testing, threat monitoring and intelligence sharing.

European data strategy

The vision of the European strategy for data is to create a single European data space in which data of all types can flow freely and be used across all sectors within the Union. Creating clear and harmonised rules on data access and sharing will help enable access to currently siloed data that can be used to create a dynamic and competitive ecosystem of innovation. Alignment between all EU digital regulations will be challenging but critical to the success of the single European data space; for example, the single European data space relies on the GDPR to protect personal data.

Regulation (EU) 2022/868 (the Data Governance Act (DGA)) and Regulation (EU) 2023/2854 (the Data Act) form the backbone of the single European data space. The DGA encourages more data sharing by public bodies and creates a model for data sharing through trusted data intermediaries. The Data Act regulates relations between actors in the data economy and aims to clarify who can use and access data for which purposes in the European Union. The main data-sharing provisions in the Data Act require manufacturers of connected products (e.g., home assistants) and related services (e.g., operating systems in connected cars) to grant access to data generated by their products and related services. Additional provisions cover business-to-public sector data sharing in public emergencies, cloud switching by customers and international data transfer restrictions for non-personal data.

To complement the common horizontal framework, the EC proposes to establish nine specialised data spaces (e.g., for agriculture and industrial manufacturing). Each data space will address challenges specific to its sector. At the time of writing, the Framework for Financial Data Access (FIDA), which focuses on financial data held by financial institutions, is being negotiated. Regulation (EU) 2025/327 (on the European Health Data Space (EHDS)) entered into force on 25 March 2025 and will start to apply as of 26 March 2027. The data shared under both instruments will be subject to strict reuse conditions and standards will be developed (e.g., technical specifications for data sharing).

EU Member States

While the DMA is intended to harmonise the rules applicable to gatekeepers and prevent fragmentation of the internal market, national initiatives have paralleled the DMA in a number of respects. Germany has amended its Act against Restraints of Competition (ARC) to address digital and data issues. The amended ARC empowers the BKartA to designate certain companies as having ‘paramount significance for competition across markets’. The BKartA can order such companies to cease engaging in prohibited types of conduct, such as denying data portability or processing competitively relevant data of third parties (both business-to-customer and business-to-business data) in a way that appreciably increases market entry barriers (as well as requiring third parties to accept terms and conditions that permit the processing). To facilitate coherent and complementary implementation of the German rules and the DMA, the BKartA has indicated that it will only apply Section 19A of the ARC to services and conduct that are not covered by the DMA. To date, the BKartA has designated five companies under Section 19A (Alphabet/Google, Meta/Facebook, Amazon, Apple and Microsoft) and it is expected that the BKartA will seek to use its additional powers with respect to these companies during 2025. In cases that targeted the same conduct as under the DMA (but addressed different services), the BKartA stressed that it cooperated closely with the EC (and, in some cases, with the German data protection authorities). This is to ensure there is no conflict between German enforcement under Section 19A of the ARC and the DMA, as both regimes are actively implemented in practice.

United Kingdom

Since exiting the European Union in 2020, the United Kingdom has adopted and implemented its own regulatory agenda. In a push to create an adaptive, business-friendly and innovation-focused environment, the Conservative government refrained from legislating extensively, most notably on AI. A new government was elected in July 2024 and, although it signalled a desire to introduce legislation to regulate the most powerful AI models, the Labour government did not indicate in its recent AI Opportunities Action Plan that it intends to bring forward any legislation imminently. Effective collaboration between sectoral regulators is central to the UK approach to regulating the digital sphere. In particular, the Digital Regulation Cooperation Forum brings together four independent regulators – privacy, communications, financial markets and competition – to cooperate to deliver a coherent and informed approach to digital regulation.

Data regulatory efforts in the United Kingdom include online safety, competition and reform of the GDPR as retained in UK law (UK GDPR). The Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) Regulations 2023 impose requirements on manufacturers of certain connected products designed to address security risks, such as banning universal default passwords and easily guessable default passwords.

The UK government has indicated its intention to reform the Network and Information Systems Regulations. A Cyber Security and Resilience Bill is expected to be introduced to the UK Parliament in 2025 but, to date, no drafts have been published.

The Online Safety Act 2023 aims to tackle online harm by creating a framework of duties applicable to user-to-user services and search services. Additional obligations will apply to a small number of services (identified by reference to the number of UK users and the features of their service). All services in scope need to implement measures to prevent or minimise the risks to individuals of encountering illegal content and mitigate the risk of harm caused by illegal content. Many services also need to take steps to protect children from online content that could be harmful. On 16 December 2024, Ofcom published its guidance on illegal content. Services in scope were required to complete an illegal harms risk assessment by 16 March 2025. Obligations to comply with the illegal harms duties are enforceable as of 17 March 2025.

The Digital Markets, Competition and Consumers Act 2024 (DMCC Act) establishes ex ante rules through platform-specific tailored codes of conduct. The Act entered into force on 1 January 2025. It empowers the CMA to designate large technology companies as having strategic market status (SMS, similar to the DMA’s ‘gatekeeper’ designation) with respect to digital activities linked to the United Kingdom. The CMA has the power to, inter alia, prevent designated companies from using data unfairly, and require these companies to allow greater access to data. In the first month with its new powers, the CMA announced three SMS investigations into two companies (Google and Apple).

Much of the Data (Use and Access) Bill aims to promote innovation through reform of the UK GDPR.

Americas

United States

Data privacy

In the United States, data regulation has historically been sector-specific, with data privacy and security requirements varying by industry; for example, the Children’s Online Privacy Protection Act addresses children’s privacy, the Health Insurance Portability and Accountability Act addresses privacy of certain health information, and the Gramm-Leach-Bliley Act addresses financial privacy. In recent years, there has been support from both consumer groups and industry groups for comprehensive broadly applicable privacy regulation, though Congress has repeatedly failed to pass any such legislation.

States have been active in filling in the gap at the federal level. At the time of writing, 20 states have enacted comprehensive data privacy laws, which take inspiration from the EU GDPR to varying degrees. Additionally, many states have passed laws relating to specialised topics, including minors’ online privacy and safety, health privacy and AI.

Access to data

Despite growing calls for new legislation to address the role of data in competition, there has been little success in enacting laws that target potential anticompetitive collection and use of data. Although several bills have been introduced in recent years, all have continued to stall in Congress; for example, the Digital Consumer Protection Commission Act, introduced in the Senate in July 2023, would create a new regulatory commission to regulate online platforms. The proposed commission would have concurrent jurisdiction with the FTC and the DOJ, and would oversee and enforce rules to promote competition and protect privacy. No vote was held on the bill.

Another bill, the Augmenting Compatibility and Competition by Enabling Service Switching Act, seeks to mandate data portability by requiring companies meeting certain thresholds to enable users to port data between platforms. Also introduced multiple times in the past in both houses of Congress, the American Innovation and Choice Online Act would prohibit dominant platforms from self-preferencing and disadvantaging rivals. The bill focuses specifically on data collection and use; for instance, it would prevent covered platforms from using non-public data generated by business users to advantage the platform’s own products. The bill was successfully passed out of the Senate Judiciary Committee in 2022 but did not reach a floor vote.

The FTC under the Biden administration called for greater antitrust scrutiny in this area; for example, it voiced privacy and antitrust concerns in respect of generative AI, arguing that it favoured dominance and scale, and could lead to further consolidation. The new administration’s views diverge from its predecessor’s, however, with President Trump revoking the Biden Executive Order on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, and committing to sustaining and enhancing the country’s dominance in AI to promote economic competitiveness and national security.

Although legislation with the aim of addressing potential issues arising from the collection and use of data has not found traction, federal legislators have continued their efforts, with many proposals addressing generative AI directly; for example, the AI Foundation Model Transparency Act, proposed in the House in 2023, would direct the FTC, in consultation with the National Institute of Standards and Technology and the Office of Science and Technology Policy, to set standards addressing the information that high-impact foundation models must provide to the FTC and make available to the public. The US House Artificial Intelligence Task Force released a report in 2024 recommending future regulations. In February 2025, the US House Committee on Energy and Commerce created a Privacy Working Group to explore a framework for legislation that maintains the country’s leadership in AI. It is unclear whether and how the new US administration will consider or carry through these policy directives.

The United States has also restricted access by countries of concern to individuals’ data and US government data. On 28 February 2024, the White House issued an Executive Order on Preventing Access to Americans’ Bulk Sensitive Data and United States Government-Related Data by Countries of Concern, and the DOJ issued a final rule implementing this Order. The rule is effective as of 8 April 2025, absent any change pursuant to congressional review. Further, the FTC may now enforce the Protecting Americans’ Data from Foreign Adversaries Act, which prohibits data brokers from sharing sensitive personal information with foreign adversaries or entities controlled by foreign adversaries.

Canada

Although there is no regulation in force that specifically addresses data and antitrust issues, the Canadian government has been active with a suite of legislation addressing sectors such as retail payments, open banking and healthcare. Further, the Competition Act was amended to tackle concerns in digital markets, and further consultations are expected on additional reforms. Similar to the approach in the United Kingdom, the main regulators form a Digital Regulators Forum covering competition, privacy and telecommunications.

At the federal level, the Canadian parliament is reviewing Bill C-27, a package that would enact three new statutes: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act. Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) provides consumers with certain data protections, including consent, collection, use and deletion requirements. Many provinces have privacy laws that are substantially similar to PIPEDA.

At the provincial level, Quebec continues to evolve its data privacy regime. In May 2024, the government introduced new requirements that organisations must follow when anonymising records containing personal information. Additionally, the final provisions of the Quebec Privacy Act, which relate to data portability, came into effect in September 2024.

Latin America

Data regulation is quite varied in Latin America. In recent years, some countries have revisited existing data privacy frameworks to align their laws more closely with the EU GDPR. Brazil is the leading jurisdiction for data regulation in the region, having approved the General Personal Data Protection Law (LGPD) in 2018 and introduced sanctions for data privacy violations.

The National Data Protection Authority (ANPD) and the Brazilian competition regulator, CADE, signed a technical cooperation agreement in 2021 that targets conduct that is harmful to the economic order and promotes a culture of free competition in services that require personal data. The press release stressed that the initiative arose from recognition of the economic importance of personal data. A draft bill (No. 2,768/2022) establishing a regulatory framework for digital platforms, which was proposed in November 2022, is similar to the European Union’s DMA and aims to ensure more fair and competitive digital markets. The proposal would make Brazil’s telecommunications regulator, Anatel, the enforcer. The bill is currently under discussion in Congress.

In January 2024, the government launched a public consultation on the regulation of digital platforms, including whether new regulations are necessary and potential changes to the competition rules. Issues around control and use of large databases were expressly identified. The consultation also noted that part of the regulatory challenge lies in identifying problems precisely and seeking balanced and proportionate responses that do not undermine benefits.

Asia-Pacific

There have been a number of legislative and regulatory developments on the interplay of data protection and antitrust in the Asia-Pacific region in recent years, particularly in Australia, China, Japan and South Korea. The use of data has been considered in regulatory tools and legislation in Japan and South Korea. Australia has carried out several in-depth inquiries into digital platforms and services, homing in on the use of data. China has updated its anti-monopoly laws and data regulations to address concerns about monopolistic behaviour and the use of data by large technology platforms.

Australia

The Australian Competition and Consumer Commission (ACCC) has been very active in examining digital markets and data-related issues, undertaking three separate inquiries (Digital Platforms Inquiry between 2017 and 2019, Digital Advertising Services Inquiry between 2020 and 2021, and Digital Platform Services Inquiry between 2020 and 2025). These may result in a legislative response.

The final report of the Digital Platforms Inquiry proposed a review of the Privacy Act 1988 (Cth) (the Privacy Act). In September 2023, the Australian government confirmed that privacy reforms should ‘bring the scope and application of the Privacy Act into the digital age by recognising the public interest in protecting privacy and exploring further how best to apply the Privacy Act to a broader range of data and entities which handle such data’.

Interim reports of the Digital Platform Services Inquiry issued since February 2020 addressed the interplay of data protection and competition law. In the fifth interim report, the ACCC recommended new targeted competition measures for certain digital platforms, entailing legally binding codes of conduct prohibiting certain behaviour and addressing data advantages, including data portability, access and separation measures. The ACCC stressed that any measures should safeguard consumers’ privacy and should follow reforms to the Privacy Act. The final report concluding the five-year inquiry is expected to be released in mid-2025. As part of this, the ACCC also proposes to examine potential competition issues relating to generative AI, considering that large language models may tend towards concentration, like traditional digital platforms, benefiting from a positive feedback loop involving, among other things, access to large volumes of high-quality user data.

Some consumer protection and data-related initiatives are also noteworthy, such as the Consumer Data Right project led by the Australian Treasury.

China

China’s Anti-Monopoly Law (AML), which was last amended in 2022, expressly prohibits companies from using, inter alia, data, algorithms, technologies, capital advantages or platform rules to engage in monopolistic behaviour such as horizontal and vertical monopoly agreements, and abusing market dominance. Several implementation rules and guidelines have been released to provide more specific guidance on competition and data issues:

Monopoly agreements: The 2021 Anti-Monopoly Guidelines for Platform Economy Industries (the Platform Guidelines) prohibit companies from using data, algorithms, platform rules and other methods to coordinate conduct. Similar requirements are included in the 2023 Regulations on Prohibiting Monopoly Agreements.Determination of market dominance and abusive behaviour: The 2023 Regulations on Prohibiting the Abuse of Dominant Market Positions and the Platform Guidelines provide that (1) the ability to control and process data, and to impede access to data, are factors in determining dominance, (2) companies are prohibited from abusing dominance through the use of data, algorithms, technologies and platform rules, and (3) protection of data security can be a justifiable reason for restrictive practices by dominant companies. In the Sumscope case, known as China’s first antitrust enforcement case targeting the data sector, the authority provides a thorough analysis of market definition and the establishment of dominance. In Sumscope, abusive behaviour was established on improper controls or restrictions on the data that a company has the right to use (but not necessarily possess ownership). Echoing the national policy aimed at facilitating the free flow and efficient use of data, this case highlights the increasing scrutiny of data-specific antitrust issues.Merger review: The Regulations on Review of Concentrations of Undertakings and the Platform Guidelines specify that the following should be considered: (1) the ability to control and process data and data interfaces; (2) control over data in terms of impact on market entry; and (3) the inappropriate use of consumer data to harm the interests of consumers. 
In addition, the State Administration for Market Regulation may impose data-related remedies, such as the divestiture of data, the opening up of platforms and data that constitute an essential facility, and the acceptance of compatibility and interoperability commitments.

Similar rules are also recognised by, and included in, the ‘Interpretation of Certain Issues Relating to the Application of the Law in the Trial of Monopoly-Related Civil Disputes’, released by the Supreme People’s Court in June 2024.

More broadly, the government has either proposed or adopted data-related regulations and guidelines in recent years, including a new Anti-Unfair Online Competition Regulation, a platform classification regime, the Data Security Law, the Cybersecurity Law, the Personal Information Protection Law and the Regulations on Network Data Security Management.

Japan

Competition in data-driven markets is governed by Japan’s general competition law, the Antimonopoly Act, which has been updated in recent years to address digital markets – and related data concerns – more effectively. The Japan Fair Trade Commission (JFTC) has created special task forces and bodies, including the inter-ministry Headquarters for Digital Market Competition (HDMC), to address competition, privacy and data issues in digital markets. It has also been active in conducting market studies, some of which have led to new regulatory initiatives and legislative amendments, including new laws targeting conduct by digital platforms (the Act on Improving Transparency and Fairness of Digital Platforms and the Act for the Protection of Consumers who use Digital Platforms).

The 2021 Report of the Study Group on Competition Policy for Data Markets focused heavily on competition considerations relevant to data, including issues such as free and fair access to data, portability, interoperability and privacy protection. Building on the HDMC’s latest report on mobile ecosystems in June 2023, the Bill for the Act on Promotion of Competition for Specified Smartphone Software was passed by the National Diet in June 2024 and promulgated later the same month, aiming to develop a competitive environment for mobile software. The Act, among other things, prohibits certain mobile software providers, as designated by the JFTC, from engaging in unfair usage of data. It also requires them to disclose the terms under which they use data obtained from mobile operating systems, application (app) stores and browsers, and to implement data portability measures to facilitate the transfer of data at users’ request.

South Korea

In December 2023, the Korea Fair Trade Commission (KFTC) proposed new rules to regulate large digital platforms in a similar fashion to the EU DMA: the Platform Competition Promotion Act (PCPA). Very little detail was released, but a press release made clear that, unlike under the DMA, platforms would be able to put forward efficiency justifications and evidence that their conduct is not anticompetitive. In September 2024, owing to various concerns and controversies surrounding the PCPA, the KFTC stepped back from its plan to legislate the PCPA. Instead, it announced that it would address anticompetitive practices in the online platform market through revisions to the existing Monopoly Regulation and Fair Trade Act of Korea (MRFTA).

On 29 April 2024, the KFTC announced that it had finalised amendments to the Merger Review Guideline, which took effect from 1 May 2024. The KFTC noted that a merger involving digital service providers is likely to result in an increase in the number of service users or the amount of data held by the merged entity, which may reinforce the merged entity’s dominance and raise entry barriers. In addition, the amendments add data as a consideration when assessing efficiency gains in mergers involving online platforms; in other words, whether additional data acquired as a result of the merger may be used to create innovative services or to reduce the costs of production or distribution.

Attempts to tackle the growing interplay between data and antitrust are also reflected in earlier revisions to the MRFTA and in the publication of related guidelines concerning merger control and abuse of dominance cases.

Open data initiatives

In theory, if implemented and leveraged correctly, open data initiatives could ensure enhanced data security and increased competition, complementing the aims of the data regulatory initiatives.

Open data is free for anyone to access, use and share. The concept is for data to be easily accessible to business, civil society and governments. The benefits of open data are wide-ranging and cross-sectoral, from improving the efficiency of public services to enhancing social participation.

At a global level, open data initiatives are well established. Organisations such as the World Bank and the Organisation for Economic Co-operation and Development have extensive and ongoing open data programmes, involving data on topics ranging from education statistics to foreign direct investment. At a regional level, the European Union and the United Kingdom both have open databases that enable users to find data published by public authorities. The EU initiatives form a key part of the European Data Strategy, discussed above. By creating and expanding the scope of mandatory data sharing by the public and private sectors, the development of common European data spaces (such as the EHDS and FIDA) is designed to complement and build on the existing EU open data initiatives and the Open Data Directive. In the United States, this is mirrored in Project Open Data, an online, public repository published on GitHub and intended to foster collaboration and to promote continual improvement of the government’s Open Data Policy.

From a corporate perspective, Microsoft, SAP and Adobe took the lead with their 2018 partnership on the Open Data Initiative, a common data model designed to unlock data from silos, enable portability and enhance controls and processing for customers, enabling more efficient processes and enriched insights. More broadly, GitHub has championed open data collaboration and hosts more than 800 million open data files. It promotes the use of licence-free, public standards for encoding and storing data.

Acknowledgements

The authors would like to thank Hattie Watson (law clerk), Michelle Zang, Alexandra Keck and Rebecca Weitzel Garcia (associates) of Wilson Sonsini Goodrich & Rosati, and Bivio Yu and Huihui Li (partners) of Fangda Partners for their contributions to this chapter.

Endnotes