South Korea has become the first country to enact a comprehensive AI safety law, regulating the technology's use at the legislative level.

The regulation, titled “Basic Law on AI Development and Building a Foundation for Trust,” took effect on Thursday, January 22.

According to South Korea's Ministry of Science and ICT, this is the world's first comprehensive AI-regulation law to unite regulatory requirements into a single framework rather than a collection of separate legal norms.

Under the law's provisions, companies and AI developers are obliged to combat deepfake content and misinformation that AI models may generate, and the government has the authority to impose fines or launch investigations for violations.

Key Provisions of the Law

The law introduces the concept of “high-risk AI,” which applies to systems that can affect users’ lives and safety, including in hiring, credit decisions, and the provision of medical advice.

Entities using such high-risk models must clearly inform users about the use of AI and bear responsibility for the safety of their services.

Content created using AI models must contain watermarks indicating its artificial origin.

More broadly, the law obliges all AI service providers to inform users when artificial intelligence is used and to ensure user safety.

The regulation also requires tech companies to establish branches in South Korea if their services serve more than 1 million users in the country or generate revenue exceeding 10 trillion won (approximately $6.8 billion). Another criterion is annual profit above 1 trillion won (about $681 million).

Companies including OpenAI and Google have confirmed that they are subject to these requirements.

Violations of the law's provisions can result in fines of up to 30 million won (more than $20,000), though a grace period of up to one year gives businesses time to adapt to the new rules.

The law reflects South Korea's effort to strengthen regulatory oversight of AI and may shape global regulatory standards in the future.