{"id":34908,"date":"2026-05-06T22:35:08","date_gmt":"2026-05-06T22:35:08","guid":{"rendered":"https:\/\/www.europesays.com\/canada\/34908\/"},"modified":"2026-05-06T22:35:08","modified_gmt":"2026-05-06T22:35:08","slug":"openai-violated-canadian-privacy-laws-federal-and-provincial-watchdogs-say","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/canada\/34908\/","title":{"rendered":"OpenAI violated Canadian privacy laws, federal and provincial watchdogs say"},"content":{"rendered":"<p>Privacy commissioners say OpenAI broke the rules by scraping Canadian data to develop and train ChatGPT.<\/p>\n<p>&#13;<\/p>\n<p>Commissioners from four of Canada\u2019s privacy watchdogs have found that OpenAI violated Canadian privacy laws while developing and training its early models of ChatGPT.\u00a0<\/p>\n<p>OpenAI gathered \u201cvast amounts of personal information,\u201d potentially including details like health conditions, political views, or information about children.<\/p>\n<p>At a news conference on Wednesday, Philippe Dufresne, Canada\u2019s privacy commissioner, was joined by his provincial counterparts from British Columbia, Alberta, and Qu\u00e9bec to announce the findings of a joint investigation into the tech giant. 
The investigation examined how OpenAI sourced training data for its early GPT-3.5 and GPT-4 models, which included scraped content from publicly accessible internet sources like social media and blog posts, licensed third-party sources like media outlets and stock image vendors, and user interactions with ChatGPT.<\/p>\n<p>Leveraging \u201cextensive written representations\u201d from OpenAI\u2019s legal counsel, interviews with OpenAI employees, internal testing on ChatGPT by the Office of the Privacy Commissioner (OPC), and publicly accessible sources like studies published by OpenAI and other AI experts, regulators focused on whether OpenAI had followed the principles of federal and provincial privacy legislation, such as consent, transparency, and data accuracy, when collecting data.\u00a0<\/p>\n<p>Launched in 2023 on the heels of a <a href=\"https:\/\/www.priv.gc.ca\/en\/opc-actions-and-decisions\/investigations\/investigations-into-businesses\/2026\/pipeda-2026-002\/#toc2-1\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">complaint alleging OpenAI had collected, used, and disclosed personal information without consent<\/a>, the investigation came well before OpenAI <a href=\"https:\/\/betakit.com\/openai-says-it-would-have-reported-tumbler-ridge-shooter-to-police-in-hindsight\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">came under scrutiny in Canada<\/a> following a deadly mass shooting in Tumbler Ridge, BC. 
Families of the victims of that shooting are taking OpenAI to court; the company had banned the shooter\u2019s account for \u201cdisturbing content,\u201d yet did not tip off law enforcement about any potential dangers.\u00a0<\/p>\n<p>Following the Tumbler Ridge shooting, Canada\u2019s innovation minister, Evan Solomon, <a href=\"https:\/\/betakit.com\/openai-says-it-would-have-reported-tumbler-ridge-shooter-to-police-in-hindsight\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">spoke with OpenAI CEO Sam Altman<\/a>, saying the tech mogul expressed \u201chorror and responsibility\u201d regarding the shooting. After their conversation, OpenAI agreed to strengthen its \u201claw enforcement referral criteria\u201d and include Canadian mental health and law experts in its safety office\u2014where the company assesses threats and whether or not to inform police.<\/p>\n<p>No consent to use personal data<\/p>\n<p>At Wednesday\u2019s press conference, Dufresne noted that all four regulators found OpenAI had violated various federal and provincial privacy laws, including the federal <a href=\"https:\/\/laws-lois.justice.gc.ca\/eng\/acts\/p-8.6\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">Personal Information Protection and Electronic Documents Act<\/a> (PIPEDA), and its provincial counterparts in Alberta, BC, and Qu\u00e9bec.\u00a0<\/p>\n<p>PIPEDA regulates how businesses collect, use, or disclose personal information during commercial activity. It operates on several \u201cfair information principles\u201d that include obtaining consent for data collection, among other stipulations. 
Parallel provincial legislation, like Alberta and BC\u2019s Personal Information Protection Acts (PIPA) and Qu\u00e9bec\u2019s Law 25, mandates similar requirements.\u00a0<\/p>\n<p>Among their <a href=\"https:\/\/www.priv.gc.ca\/en\/opc-news\/news-and-announcements\/2026\/bg-info_openai_260506\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">key findings<\/a>, regulators concluded that OpenAI gathered \u201cvast amounts of personal information\u201d for use in training data. That data could potentially include sensitive information and details like health conditions, political views, or information about children.<\/p>\n<p>Regulators also found the tech company did not obtain valid consent for the collection of personal information\u2014a key plank under PIPEDA and other Canadian privacy legislation\u2014and that there was not adequate transparency, with many users unaware their data was collected and used to train OpenAI\u2019s chatbot.\u00a0<\/p>\n<p><a href=\"https:\/\/betakit.com\/openai-says-it-would-have-reported-tumbler-ridge-shooter-to-police-in-hindsight\/\" rel=\"nofollow noopener\" target=\"_blank\">RELATED: Evan Solomon will meet\u00a0Sam Altman as OpenAI faces pressure over Tumbler Ridge response<\/a><\/p>\n<p>\u201cOur investigation determined that the manner in which OpenAI initially collected personal information from publicly accessible websites and licensed third-party sources to train the models was overbroad and therefore inappropriate,\u201d an <a href=\"https:\/\/www.priv.gc.ca\/en\/opc-actions-and-decisions\/investigations\/investigations-into-businesses\/2026\/pipeda-2026-002-overview\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">overview<\/a> of the investigation says. 
\u201cWe came to this determination considering the scale, nature, and varying levels of sensitivity of the personal information collected and used from those sources.\u201d<\/p>\n<p>The privacy watchdogs also found that OpenAI had not provided individuals with \u201can easily accessible and effective mechanism to access, correct, and delete their personal information,\u201d and that it released ChatGPT without having fully addressed known privacy risks and without data-deletion rules.<\/p>\n<p>A full accounting of the report and its findings can be found <a href=\"https:\/\/www.priv.gc.ca\/en\/opc-actions-and-decisions\/investigations\/investigations-into-businesses\/2026\/pipeda-2026-002\/#toc2-1\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">here<\/a>.\u00a0<\/p>\n<p>OpenAI commits to changes<\/p>\n<p>Dufresne said that throughout the investigation, OpenAI engaged in good faith and took measures to address the regulators\u2019 concerns. As a result, the federal privacy office considers the investigation to be \u201cconditionally resolved.\u201d Qu\u00e9bec\u2019s Commission d\u2019acc\u00e8s \u00e0 l\u2019information du Qu\u00e9bec has labelled the investigation as conditionally resolved on several points, but unresolved on the issue of consent. British Columbia and Alberta\u2019s findings label the investigation as unresolved under provincial PIPA requirements. 
Both provincial regulators noted OpenAI\u2019s efforts to improve compliance.\u00a0<\/p>\n<p>OpenAI has committed to several measures to address the regulators\u2019 concerns, including implementing a filtering tool to detect and mask personal information like names and phone numbers in publicly accessible datasets, enhancing correction and deletion protocols, and implementing a formal retention policy governing personal information.\u00a0<\/p>\n<p>The company has also committed to several time-sensitive conditions, linked to the publication of the watchdogs\u2019 report. They include:\u00a0<\/p>\n<p>Within three months, adding a notice to the signed-out web version of ChatGPT that tells users their chats may be reviewed and used to train models, and advising them not to share sensitive information.<\/p>\n<p>Within six months, making it easier to understand and use the data exports that it provides to users who request their personal information. The company will also better explain the avenues available to users who want to challenge the completeness, accuracy, or nature of the information provided.<\/p>\n<p>Within six months, confirming to the privacy commissioners\u2019 offices that it has implemented strong protection for future datasets that are retired and used only as historical references, so they are not used for active model development, and that it will regularly review whether these datasets should still be kept.<\/p>\n<p>Within six months, testing protective measures for the minor family members of public figures, who are themselves not public figures, to ensure that the models refuse requests for their name or date of birth.<\/p>\n<p>The company will also provide quarterly reports to the Office of the Privacy Commissioner and provincial partners until these commitments have been met.<\/p>\n<p>It is unclear at this time what efforts need to be undertaken by the tech company to resolve Alberta and British Columbia\u2019s 
complaints.<\/p>\n<p>BetaKit reached out to OpenAI for comment on the report\u2019s findings, but it did not respond to our request by press time.\u00a0<\/p>\n<p>Canada\u2019s privacy laws must change<\/p>\n<p>While much of the announcement focused on OpenAI, regulators also stressed that significant changes are needed to Canadian privacy laws that recognize the realities of a rapidly changing technological landscape.\u00a0<\/p>\n<p>Canada\u2019s privacy legislation hasn\u2019t been meaningfully updated in more than 40 years; Ottawa announced this spring that it has launched a review of the Privacy Act with the intent of modernizing it. Canadians are also awaiting the launch of the country\u2019s AI strategy, which was initially slated for late 2025.<\/p>\n<p>\u201cThis investigation also further reinforces the need to modernize Canada\u2019s privacy laws for the digital age,\u201d Dufresne said. \u201cWhile current laws apply to AI, updated laws would help further support the safe deployment of new technologies to protect Canadians\u2019 fundamental right to privacy.\u201d<\/p>\n<p>\u201cThe methods companies are using \u2026 could never be carried out in ways that would meet the consent requirements of [Alberta\u2019s] PIPA.\u201d<\/p>\n<p>Diane McLeod,<br \/>Alberta privacy commissioner<\/p>\n<p>Specifically, commissioners cited the challenges that AI, and the internet broadly, pose in meeting consent requirements as currently legislated. Michael Harvey, the BC privacy commissioner, said he has written to BC\u2019s minister of citizen services to encourage modernization of its legislation.\u00a0<\/p>\n<p>\u201cWe\u2019re left at an impasse: on one hand, AI applications have potentially transformative benefits, but in certain cases, such as the one before us, applications are developed without adequate privacy,\u201d he said. \u201cOn the other hand, those privacy laws were written for a different era and are strained to the brink. 
Both companies and the law have to change.\u201d<\/p>\n<p>Alberta commissioner Diane McLeod echoed those sentiments, saying that legislation needed to confront the realities of the digital age.<\/p>\n<p>\u201cThe methods companies are using\u2014scraping data from publicly accessible websites\u2014could never be carried out in ways that would meet the consent requirements of [Alberta\u2019s] PIPA,\u201d she said. \u201cMy office has advocated for some time that changes be made to PIPA to allow for tech and innovation but still provide privacy safeguards.<\/p>\n<p>\u201cConsent-based protections, for example, may no longer be feasible in an age where technology companies have easy access to so much information about individuals on the internet. Other options must be found,\u201d she added.\u00a0<\/p>\n<p>In a statement issued Wednesday afternoon, Solomon mirrored the regulators\u2019 comments, saying the report\u2019s findings underscored \u201cthe importance of protecting Canadians\u2019 personal information in the age of AI.\u201d He added that modernizing Canada\u2019s privacy laws \u201cremains a priority\u201d for the federal government.<\/p>\n<p>BetaKit\u2019s Prairies reporting is <a href=\"https:\/\/betakit.com\/betakit-to-open-full-time-prairies-bureau\/\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">funded in part by YEGAF<\/a>, a not-for-profit dedicated to amplifying business stories in Alberta.<\/p>\n<p>Feature image courtesy TechCrunch. Licensed under Creative Commons Attribution 2.0 Generic (CC BY 2.0).<\/p>\n","protected":false},"excerpt":{"rendered":"Privacy commissioners say OpenAI broke the rules by scraping Canadian data to develop and train ChatGPT. 
&#13; Commissioners&hellip;\n","protected":false},"author":2,"featured_media":34909,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[164,239,17,3841,61,526,241],"class_list":{"0":"post-34908","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-canada","8":"tag-alberta","9":"tag-british-columbia","10":"tag-canada","11":"tag-govt","12":"tag-ottawa","13":"tag-prairies","14":"tag-quebec"},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/posts\/34908","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/comments?post=34908"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/posts\/34908\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/media\/34909"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/media?parent=34908"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/categories?post=34908"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/canada\/wp-json\/wp\/v2\/tags?post=34908"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}