A. The short answer is no. Individuals do not have an absolute ownership right in their names or likenesses. But the law does give individuals certain rights of “privacy” and “publicity” which provide limited rights to control how your name, likeness, or other identifying information is used under certain circumstances.
I’d much rather that we do nothing, let it proliferate to the point where nobody trusts nudes at all any more.
But then it’s legal to create pornography of anybody, whether they consent or not. Which, well, isn’t exactly good. So it honestly should be made law.
That’s perfect. It should be legal. Making pornography of someone illegal is just a different shade of grey from, say, making drawings of Muhammad illegal, etc.
I can already hire an artist to make me some porn of …I dunno…Obama or something. Why should that be illegal just because someone does it with AI instead?
But then they wouldn’t have consented to the creation of porn of themselves, which, if it is a deepfake, is literally non-consensual porn.
Hate to break it to you: this is already legal. “Non-consensual porn” only applies to photographs. Nobody should have to consent to everything like that.
If I draw you standing under the Eiffel Tower, fully clothed, the legality shouldn’t change just because you don’t LIKE what’s being drawn.
I’m aware it’s already legal, which is exactly why action should be taken. Plus, videos are just a bunch of photos stitched together, so I don’t see your point about it only applying to photos.
Because the nudity is the only thing that’s different from people simply drawing others in art.
Just because you don’t like pornography shouldn’t change the legality of it. It’s prudishness and puritanism at its finest.
It’s not porn in general that should be illegal, ONLY pornography where the person has not explicitly said they would like to be in it, such as deepfake porn, or drawn porn where the person likewise hasn’t said they would like to be in it.
If I draw a nude stick figure with two perfect-circle tits and say it’s Taylor Swift, would I have broken the law?
I’m sorry but holy fuck that is just morally bankrupt.
Someone should have the ABSOLUTE right to control any distribution of an image of themselves of a sexual nature that they didn’t actively consent to being out there.
Anything less is the facilitation of the culture of sexual abuse that lets the Fappening or age-of-consent countdown clocks happen.
Drawing a picture of someone under the Eiffel Tower is a wildly different act from drawing them in the nude without them knowing and agreeing, with full knowledge of what you plan to do with that nude piece.
Calling this sexual abuse is absolutely insulting and disgusting
Trying to pretend it’s not is feeding the culture of not listening to victims.
It’s like saying that catcalling is harmless; forcing people to be reminded that they are seen as a sex object is a well-known and documented tool for keeping the victim “in their place.”
It’s harassment, and when done at the scale famous folks experience for the crime of being well known and also attractive, it basically amounts to a campaign of terror via sexual objectification.
Never mind how tolerating it makes space for even more focused acts of terror like doxxing and threats of sexual assault.
Then you need to take a step back and look at your argument.
Producing the work isn’t the problem here. Distributing it and harassing people with it is.
So why don’t we just make distributing it as a form of harassment illegal instead? You deal with the specific thing that causes the problem, not the broader thing it stems from, just because you don’t like nudity.
But if I want to sit here and make AI pictures of women and whack off to them in my bunk, fake women who might incidentally look like some real woman, nobody should be penalized for that. You’re painting with a broad brush here, without thinking of the larger repercussions.
What about twins? Who consents there? If one gives permission and the other doesn’t, then what? How do you handle edge cases like that? Because you’re now trying to put rules around something that’s an awfully grey area.
You lose all those weird edge cases once you attack the real problem: harassing people with sexual images. It’s not the nudes that are the problem, it’s the harassment.
No, it’s insulting to actual victims of actual events that happen in real life
I’m wondering if the degree of believability of the image has, or should have any bearing on the answer here. Like, if a third party who was unaware of the image’s provenance came across it, might they be likely to believe the image is authentic or authorized?
For another angle, we allow protections on the usage of fictional characters/their images. Is it so wild to think that a real person might be worthy of the same protections?
Ultimately, people are going to be privately freaky how they’re gonna be privately freaky. It mostly only ever becomes a problem when it stops being private. I shouldn’t have to see that a bunch of strangers made porn to look like me, and neither should Taylor. And mine are unlikely to make it into tabloids.
From https://www.owe.com/resources/legalities/7-issues-regarding-use-someones-likeness/
From that page, it actually looks like there are very specific criteria for this, and Taylor Swift HERSELF is protected because she is a celebrity.
However, there are still a lot of gotchas. So instead of making the product/art itself illegal, using it as harassment should be what’s illegal. Attaching someone’s name to it in an attempt to defame them is what’s already illegal here.
Having an image exist somewhere of them isn’t the sort of thing a person should have to consent to.
Consent is for things that affect that person.
I really hope your gf makes you watch AI porn of her with your best friend.
You really need healthier relationships in your life, I think; my wife would have no reason to do such a thing.