{"id":291232,"date":"2025-10-10T06:12:18","date_gmt":"2025-10-10T06:12:18","guid":{"rendered":"https:\/\/www.europesays.com\/us\/291232\/"},"modified":"2025-10-10T06:12:18","modified_gmt":"2025-10-10T06:12:18","slug":"sag-sftra-latest-hollywood-player-to-come-out-swinging-against-ai-sora","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/291232\/","title":{"rendered":"SAG-AFTRA Latest Hollywood Player To Come Out Swinging Against AI &#038; Sora"},"content":{"rendered":"<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t<strong>UPDATED with SAG-AFTRA statement: <\/strong>SAG-AFTRA has followed <a href=\"https:\/\/deadline.com\/tag\/uta\/\" id=\"auto-tag_uta\" data-tag=\"uta\" rel=\"nofollow noopener\" target=\"_blank\">UTA<\/a>, CAA and the MPA in <a href=\"https:\/\/deadline.com\/2025\/10\/sora-2-hollywood-ai-sam-altman-1236572662\/\" rel=\"nofollow noopener\" target=\"_blank\">sounding the alarm over Sora 2<\/a>, the newest version of <a href=\"https:\/\/deadline.com\/tag\/open-ai\/\" id=\"auto-tag_open-ai\" data-tag=\"open-ai\" rel=\"nofollow noopener\" target=\"_blank\">OpenAI<\/a>\u2019s video-generating app. It\u2019s also the version most threatening to Hollywood. <\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tSAG-AFTRA President Sean Astin and National Executive Director &amp; Chief Negotiator Duncan Crabtree-Ireland insisted in a joint statement on Thursday that art is about connection and performance, not simulation. <\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cThe world must be reminded that what moves us isn\u2019t synthetic. 
It\u2019s human,\u201d they wrote.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tThe duo called out the tech companies and the media for creating \u201ca sensationalized narrative, designed to manipulate the public and make space for continued exploitation.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tSpecifically, with the focus on Tilly Norwood, they say, news outlets anthropomorphize a batch of code and \u201ctease the story of a more realistic-looking artificial creation as an entertainment industry \u2018breakthrough\u2019 or breathlessly stoke a non-existent \u2018star signing\u2019 competition among agencies.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tPoliticians come in for some blame for not regulating Artificial Intelligence and protecting creators.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cTilly is not the threat, the real danger comes from an unregulated environment that can only flourish by stealing digital information from artists and companies and using it without ethics or respect. This story of creating synthetic characters is not about novelty. It\u2019s about authorship.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tSome argue that, with AI, the author is the human delivering prompts to the software. Astin and Crabtree-Ireland acknowledge those human inputs, but call such a process an \u201cinsult\u201d to artistry.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cYes, there is human effort in assembling synthetic imagery or voices like Tilly Norwood,\u201d they write. \u201cBut that process undermines the very ecosystem that makes storytelling possible. 
It insults the artistry of our performers, assaults our business, and threatens the legacy our members\u2019 work creates, in many cases built over generations.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tOf OpenAI\u2019s controversial <a href=\"https:\/\/deadline.com\/2025\/10\/mpa-sora-2-openai-copyright-infringement-1236571849\/\" data-type=\"post\" data-id=\"1236571849\" rel=\"nofollow noopener\" target=\"_blank\">\u201cOpt-out\u201d policy<\/a>, they write, \u201cOpt-out isn\u2019t consent.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tThe duo is more sanguine about another aspect of <a href=\"https:\/\/deadline.com\/tag\/sora-2\/\" id=\"auto-tag_sora-2\" data-tag=\"sora-2\" rel=\"nofollow noopener\" target=\"_blank\">Sora 2<\/a>.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cSora 2\u2019s approach to image, likeness and voice replication through its \u2018cameo\u2019 function deserves recognition. This feature allows you to create a digital replica of yourself and control its use within Sora 2, including whether others can access it. Critically, this approach is opt-in, which makes all the difference. While the controls and details remain imperfect, they incorporate core principles of informed consent and implement them systematically. This reflects months of dialogue between SAG-AFTRA and OpenAI, along with the dedicated work of our A.I. Taskforce and staff. We hope more A.I. companies will follow OpenAI\u2019s lead in this respect.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tFinally, they cite three principles guiding their efforts.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t<strong>1.) Performance must remain human-centered.<\/strong><\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t<strong>2.) A.I. 
can enhance creativity, but it must never replace it.<\/strong><\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t<strong>3.) A.I. use must be transparent, consensual, and compensated. <\/strong><\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tSee their joint statement in full at the bottom of this post.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tEarlier today, UTA weighed in.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cThere is no substitute for human talent in our business, and we will continue to fight tirelessly for our clients to ensure that they are protected. When it comes to OpenAI\u2019s Sora or any other platform that seeks to profit from our clients\u2019 intellectual property and likeness, we stand with artists,\u201d the agency said.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\t\u201cThe future of industries based on creative expression and artistry relies on controls,\u00a0protections, and rightful compensation. The use of such property without consent, credit or compensation is exploitation, not innovation.\u201d\u00a0<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tYesterday, <a href=\"https:\/\/deadline.com\/2025\/10\/caa-skeptical-open-ai-sora-hollywood-strategy-1236574061\/\" rel=\"nofollow noopener\" target=\"_blank\">CAA weighed in<\/a>, saying it \u201cis unwavering in our commitment to protect our clients and the integrity of their creations. The misuse of new technologies carries consequences that reach far beyond entertainment and media, posing serious and harmful risks to individuals, businesses, and societies globally. 
It is clear that OpenAI\/<a href=\"https:\/\/deadline.com\/tag\/sora\/\" rel=\"nofollow noopener\" target=\"_blank\">Sora<\/a>\u00a0exposes our clients and their intellectual property to significant risk.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tAnd earlier this week, the MPA chief <a href=\"https:\/\/deadline.com\/2025\/10\/mpa-sora-2-openai-copyright-infringement-1236571849\/\" rel=\"nofollow noopener\" target=\"_blank\">Charles Rivkin blasted the app<\/a>, noting that \u201cSince Sora 2\u2019s release, videos that infringe our members\u2019 films, shows, and characters have proliferated on OpenAI\u2019s service and across social media.\u201d\u00a0<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tThe Sora generative video model allows users to create social-media-ready videos with just a brief text prompt. The result can be a product of the user\u2019s imagination, or a fan-fiction-like story using recognizable properties.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tIn response to the furor from content creators over the unauthorized use of their works, OpenAI\u2019s Sam Altman walked back the company\u2019s initial approach of having IP owners opt out of having their stuff fed into the model. He wrote last week that \u201cwe will give rightsholders more granular control over generation of characters, similar to the opt-in model for likeness but with additional controls.\u201d <\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tHe also held out the possibility of remuneration down the line. But it\u2019s still vague. \u201cWe are going to have to somehow make money for video generation \u2026 We are going to try sharing some of this revenue with rightsholders who want their characters generated by users. The exact model will take some trial and error to figure out, but we plan to start very soon. 
Our hope is that the new kind of engagement is even more valuable than the revenue share, but of course we want both to be valuable.\u201d<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tHollywood is skeptical, and while studios haven\u2019t sued OpenAI yet, they have been testing the waters against smaller companies, from San Francisco-based Midjourney to Chinese AI firm MiniMax, along with various cease-and-desist letters.<\/p>\n<p class=\"paragraph larva \/\/ lrv-u-margin-lr-auto     \">\n\tHere is the statement from SAG-AFTRA\u2019s Astin and Crabtree-Ireland:<\/p>\n<blockquote class=\"pullquote pullquote-deadline larva \/\/  \">\n<p>A.I. developments are in the headlines. Here\u2019s what SAG-AFTRA is doing.<\/p>\n<p>\u00a0Last week, dramatic headlines and a crescendo of news stories broke regarding an A.I.-generated synthetic character called \u201cTilly Norwood\u201d\u00a0and the release of Sora 2 by OpenAI. These developments reignited debate about how artificial intelligence is reshaping the film and television industry. The confluence of these stories has broken through the relentless stream of news and information on this subject and focused our collective curiosity and anxiety about the future of creativity itself at this moment. It\u2019s an opportune time to demonstrate how our union is fighting to protect our interests and speak in defense of what it means to be a performer.<\/p>\n<p>Let\u2019s be clear:\u00a0Tilly Norwood is not a person.\u00a0It\u2019s a synthetic construct generated by software trained on the work of countless professional performers, real human beings, whose work was taken without permission, without credit and without compensation. 
When news outlets tease the story of a more realistic-looking artificial creation as an entertainment industry \u201cbreakthrough\u201d or breathlessly stoke a non-existent \u201cstar signing\u201d competition among agencies, it misses the fundamental truth: Tilly is not the threat, the real danger comes from an unregulated environment that can only flourish by stealing digital information from artists and companies and using it without ethics or respect. This story of creating synthetic characters is not about novelty. It\u2019s about authorship, consent and the value of human artistry. Anthropomorphizing the synthetic fake, giving it a memorable name and playing up the character\u2019s representation of beauty is not objective or authentic; it is a distraction, a misdirection from what is actually taking place. What has been created is a sensationalized narrative, designed to manipulate the public and make space for continued exploitation.<\/p>\n<p>The public release of Sora 2 and its remarkably advanced capabilities excited some observers. For many more of us, this lightning-fast technological evolution brings profound concern. OpenAI\u2019s decision to honor copyright only through an \u201copt-out\u201d model threatens the economic foundation of our entire industry and underscores the stakes in the litigation currently working through the courts. If A.I. companies can shift the burden to rightsholders to opt out, what does copyright really mean? Opt-out isn\u2019t consent \u2014 let alone informed consent. That\u2019s why SAG-AFTRA fights for opt-in approaches. No one\u2019s creative work, image, likeness or voice should be used without affirmative, informed consent. Anything less is an unjustifiable violation of our rights.<\/p>\n<p>That said, Sora 2\u2019s approach to image, likeness and voice replication through its \u201ccameo\u201d function deserves recognition. 
This feature allows you to create a digital replica of yourself and control its use within Sora 2, including whether others can access it. Critically, this approach is opt-in, which makes all the difference. While the controls and details remain imperfect, they incorporate core principles of informed consent and implement them systematically. This reflects months of dialogue between SAG-AFTRA and OpenAI, along with the dedicated work of our A.I. Taskforce and staff. We hope more A.I. companies will follow OpenAI\u2019s lead in this respect.<\/p>\n<p>But fundamentally, we must remember and remind everyone that audiences don\u2019t build emotional connections or lifelong relationships with algorithms. They connect with artists. They see themselves reflected in real human performances, in the joy, heartbreak, jealousy, love, resilience and truth that only a person can express. From Homer and Shakespeare to today\u2019s storytellers, performance has always been a mirror of our shared humanity. No dataset or generative model can capture that spark. A.I. is getting more realistic looking, but audiences will always gravitate to that which is real and true. They want to know that the artist really feels what they are feeling, in a laugh, a sob, a smile or a tear. When they are moved, they want to know that what they are feeling is real. They want proof. The intent of the artist is what makes it real. If you scrape, feed or otherwise deconstruct the love or anger into a trillion component parts and then reassemble them to be manipulated and repurposed without telling the audience exactly where it came from, you are cheapening their experience and unmooring their reality.\u00a0<\/p>\n<p>Yes, there is human effort in assembling synthetic imagery or voices like Tilly Norwood.\u00a0But that process undermines the very ecosystem that makes storytelling possible. 
It insults the artistry of our performers, assaults our business, and threatens the legacy our members\u2019 work creates, in many cases built over generations.<\/p>\n<p>That\u2019s why SAG-AFTRA has fought and will continue to fight for strong, enforceable protections.\u00a0Throughout labor history, when new technologies emerge and are adopted by business, workers are disadvantaged and unions must fight to protect them.\u00a0<\/p>\n<p>In 2017, your union leadership formally identified this threat and began working passionately, creatively and unceasingly to combat it. Many of our members don\u2019t know that we have partnered with policymakers, in some cases even drafting A.I. protection legislation. We have been navigating the complexities of intellectual property law and more, because the power of our contracts must be amplified by the law, and so far, that law barely exists.\u00a0<\/p>\n<p><strong>The protections your union has already secured:<\/strong><\/p>\n<p>In the 2023 strike, SAG-AFTRA won its first-ever, enforceable protections around artificial intelligence. Employers must obtain clear, informed consent before creating or using a digital replica. They must pay fairly for that use. They cannot reuse a scan or a performance indefinitely without new bargaining and compensation.\u00a0And they cannot deploy synthetic performers in covered film, television and streaming projects, except under strict conditions.<\/p>\n<p>Our contracts across film, television, commercials and other areas generally require notice and bargaining when synthetic performers are going to be used.<\/p>\n<p>If a synthetic character is created by prompting a generative A.I. system with a performer\u2019s name (with the addition, in some cases, of a major facial feature), the performer\u2019s consent is required.\u00a0<\/p>\n<p><strong>What the law does not yet protect:<\/strong><\/p>\n<p>Our contracts bind only signatory employers. They can\u2019t stop A.I. 
developers from scraping performances off the internet or from training models on decades of film and television without permission. That is why SAG-AFTRA has been leading the fight for stronger laws:<\/p>\n<p>The\u00a0<strong>No FAKES Act<\/strong>\u00a0would prohibit unauthorized digital replicas.<\/p>\n<p>The\u00a0<strong>TRAIN Act<\/strong>\u00a0would require transparency in training datasets.<\/p>\n<p>The\u00a0<strong>A.I. Accountability &amp; Data Protection Act<\/strong>\u00a0would create a federal tort for unauthorized use of biometric data.<\/p>\n<p>And in California and New York, we\u2019ve championed legislation requiring transparency and disclosure when synthetic performers are used.<\/p>\n<p><strong>Where we go from here:<\/strong><\/p>\n<p>We know that some companies will continue to push the limits, marketing \u201csynthetic performers\u201d as the next big thing. And while tools like Sora 2 generate awe for their technical prowess, audiences continue to show that what moves them most is not simulation, it\u2019s sincerity. The real connection happens only when a living performer brings a story to life.\u00a0<\/p>\n<p>Our commitment is simple and our position is unwavering:<\/p>\n<p>Performance must remain human-centered.<\/p>\n<p>A.I. can enhance creativity, but it must never replace it.<\/p>\n<p>A.I. use must be transparent, consensual, and compensated.<\/p>\n<p>We invite you to learn more about your protections and our ongoing advocacy at\u00a0sagaftra.org\/ai.<\/p>\n<p>This moment is noisy, but it\u2019s also clarifying. The world must be reminded that what moves us isn\u2019t synthetic. It\u2019s human. 
And as long as SAG-AFTRA exists, that humanity will be defended.<\/p>\n<p>With respect and resolve,<\/p>\n<p>Sean Astin<br \/>President<\/p>\n<p>Duncan Crabtree-Ireland<br \/>National Executive Director &amp; Chief Negotiator<\/p>\n<\/blockquote>\n","protected":false},"excerpt":{"rendered":"UPDATED with SAG-AFTRA statement: SAG-AFTRA has followed UTA , CAA and the MPA in sounding the alarm at&hellip;\n","protected":false},"author":3,"featured_media":291233,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[21],"tags":[691,738,3226,142367,149317,158,67,132,68,10080],"class_list":{"0":"post-291232","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-open-ai","11":"tag-sora","12":"tag-sora-2","13":"tag-technology","14":"tag-united-states","15":"tag-unitedstates","16":"tag-us","17":"tag-uta"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115348396317632406","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/291232","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/comments?post=291232"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/291232\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/291233"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=291232"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":
"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/categories?post=291232"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/tags?post=291232"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}