
Creating sexually explicit deepfakes to become a criminal offence
https://www.bbc.co.uk/news/uk-68823042
by OneArmJack

It’s a nice idea, because I can see how somebody creating fake porn images of you would be absolutely awful, but there’s something here that reads as contradictory to me.
>Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and unlimited fine.
Fine, makes sense.
>Under the Online Safety Act, which was passed last year, the sharing of deepfakes was made illegal.
Yes, this also makes sense.
>The new law will make it an offence for someone to create a sexually explicit deepfake – even if they have no intention to share it but “purely want to cause alarm, humiliation, or distress to the victim”
So by the definition of the new law, it would only apply to those images that haven’t been or won’t be shared, because a law already exists to cover sharing. The crux of the new offence appears to be that you will be guilty of creating them only if you “purely want to cause alarm, humiliation, or distress to the victim”.
How are you supposed to cause alarm, humiliation or distress to the victim if you DON’T share them? If you create them and they remain only on your computer for personal ‘use’, and you don’t show anybody else, how could this alarm and distress be caused? I suppose you could create them and verbally tell the victim that you had, but you wouldn’t even need to create a picture to achieve that effect, so if you didn’t create one you still wouldn’t be guilty by the definition.
If somebody has seen something I’ve missed here then I’m all ears.
Call me crazy, but I took all my kids’ pictures off social media recently because of deepfakes.
I never had lots anyway. I just didn’t expect this to be a worry I would ever have.
Maybe because, even when they’re stored on your own devices with no intention to share, it’s still possible for those pictures to be viewed or obtained by other people, even if you never intended to show anyone at all.
I’m also not sure whether showing people a picture on your own phone counts as “sharing” legally, so it could possibly relate to that too.
The article doesn’t really explain it very well, does it, so who knows lol, but there must be something that allows the law to act on a broader variety of problems relating to deepfakes.
I’m not sure why my mind went there, but how does this law interact with the dead?
For example, if someone made interracial gay porn of Hitler to troll neo-Nazis, have they committed a crime because Ol’ Adolf might have had his feelings hurt if he were still alive?
I’d hope there is some sort of exception in the law for situations like this; I don’t want to see someone go to jail for making fun of Hitler.
I’m pretty sure most of the questions people have raised in this post will have been covered in the details of the legislation.
Reasonable and necessary, but we know it’ll be hard to enforce. The best we can do, as a society, is clearly view it as morally reprehensible; leaking somebody’s nudes used to not be deemed deplorable.
So I can’t even alarm myself with my own deepfakes now? Thanks a lot.
Remember when someone depicted Tom Cruise as gay in a (possibly fictional) story in a celebrity magazine? Cruise was awarded millions when someone “defamed” him by claiming they’d had gay sex: https://www.theguardian.com/world/2003/jan/16/film.filmnews Cruise said he was “humiliated” by the story.
Would Cruise have won that case today?
If someone like Cruise argued that some sexual depiction of them, like that story, “defamed” them, or caused them “alarm, humiliation or distress”, would they be likely to win that case today?
Or would we, today, say that being depicted as gay isn’t defamation, and that an average person shouldn’t be alarmed or humiliated or distressed at a sexualised depiction of themselves, fictional or otherwise?
Is it possible that, a few years from now, this law will be seen in a similar way? That the average person shouldn’t be alarmed or humiliated or distressed at a sexualised depiction of themselves?
Is it better to encourage people to grow to accept the existence of these forms of depiction and expression, which are going to be *everywhere*? To encourage them to remove the power from these images by not caring about them?
Deepfakes are created by AI, and the AI must be trained, so if someone creates a deepfake of something illegal (child porn, rape, bestiality, etc.) it must have been trained on thousands of real images that are extremely illegal. So if a teenager takes a photo of a classmate and deepfakes a sexualised image of them, it’s not a minor issue; it’s one of the most serious crimes you can imagine.
Well, at least that’s some good news. Even though legislation will find it difficult to keep up with AI.
No more bedding Taylor Swift, every night inside the Oculus Rift...