
Creating sexually explicit deepfake images to be made an offence in the UK | Deepfake
https://www.theguardian.com/technology/2024/apr/16/creating-sexually-explicit-deepfake-images-to-be-made-offence-in-uk

12 comments
I think deepfakes, along with a myriad of other techniques, are used by scumbags to put down opinions and people.
That being said, I don’t see how this law will stop any of that.
Maybe politicians should push for critical thinking and media literacy at a younger age. But that would only make their own jobs harder, wouldn’t it, having a populace that is skilled at discerning fact from fiction?
It sounds like this would almost make it illegal to use software like Stable Diffusion. At least in my experience you get sexually explicit images whether you like it or not.
I’d get selling or sharing deepfaked images being illegal.
But this bans possession too, without the intent to share, as the image itself is a violation of privacy.
Are they going to ban sexual thoughts about privileged people next?
Like when they tried stopping illegal downloads — good luck with that.
Problem with this law is that the requirements for a successful prosecution are absurd.
[From the BBC](https://www.bbc.co.uk/news/uk-68823042):
>The new law will make it an offence for someone to create a sexually explicit deepfake – even if they have no intention to share it but “purely want to cause alarm, humiliation, or distress to the victim”, the MoJ said.
>Clare McGlynn, a law professor at Durham University who specialises in legal regulation of pornography and online abuse, told the Today programme the legislation has some limitations.
>She said it “will only criminalise where you can prove a person created the image with the intention to cause distress”, and this could create loopholes in the law.
So first the creator of the deepfake has to be found, which in the case cited in that BBC story has already proven difficult, and then the prosecution has to prove the perp intended to cause “alarm, humiliation, or distress”.
Creating explicit deepfakes that then get out into the world should simply be a strict liability offence.
This is yet another idea from a government that can’t find its legislative arse with both hands.
[deleted]
Must be great for the UK when your Minister for victims and safeguarding is suggesting laws to combat “misogynistic deepfakes”, meanwhile the law still doesn’t recognise that you can actually rape a man in a heterosexual relationship.
Progressives are some of the worst Western sexists.
Folks are posting based on just the headline. 😉 The law isn’t targeted at all AI-created porn, just images of real people without their consent. That shouldn’t be controversial; it’s basically existing law being extended so a new technology isn’t treated as a loophole.
> “It is another example of ways in which certain people seek to degrade and dehumanise others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This government will not tolerate it.”
Legally speaking, at what point does an image become a deepfake? Isn’t the defining characteristic of a deepfake that it is indistinguishable from a real photo or video? The majority of AI-generated images are very clearly not real. So what does this law actually target?
But is it ok if I make them privately and never distribute them? Asking for a horny friend…
Don’t understand why this is necessary.
Not because I don’t agree with it, but because I don’t think deep fakes of real people should be legal at all (without their consent).
How is it OK to make a deep fake of a colleague at work beating up a child? Just because it isn’t porn?
Where does one even find these deepfakes? Seems like such a non-issue.