Too many people are profiting from these claims. Hence you would have to reform a major part of law enforcement, politics, and some part of the IT security industry. As to whether that is doable, refer to the "War on drugs".

> Too many people are profiting from these claims.

Profit? Man, the models required to do face swap / body impersonation are openly published, and there are plenty of open source projects that will let you play around with the concept. I doubt any of them have a magic "fail if I see a titty" component in their models.

And if you could read, you would have understood that I was referring to something else.

Not a problem. We'll just make a deepfake of your wife being railed by a pig. Oh wait. You don't have a wife.

Whatever you are into; if you get enjoyment out of it, then it's up to you. If you never distributed it, how would I know? How would it affect me or my wife in any way? Right now, if someone sees anyone, they can imagine them doing any sexual act they want. If it gets distributed, then as long as it's clearly labeled as a deepfake, I also don't care. It would probably be lost in the sea of porn anyway; I probably wouldn't know either. If it is portrayed as real, then that is fraud and character assassination, however that is
There are websites advertising this service for the purpose of blackmailing women into doing sexual favours. Imagine if your daughter was bullied with deepfake images. Imagine if your wife lost her job because someone sent deepfakes to her employer. These are things that are actually happening already. If you never found out, okay. But by making this a cheap and easily available service, with results good enough to pass cursory examination, especially if they are resized and heavily JPEG compressed to hide f

Maybe the DOJ is going after them. That doesn't mean that the people producing the AI in the first place can't agree to block this kind of thing. They already do it for so many other things.

Yes. You do not have permission to use their likeness. Not hard to understand.

Actually, it is hard to understand, since it's currently legal to use someone's likeness. If Taylor Swift walks down the street, I can legally photograph her. I can't sell the photo for commercial use in advertising or product endorsement, but I can sell it to a newspaper for editorial use, or post it to my Facebook page. It is also legal for me to edit the photo, rearrange the pixels, or use AI to create an image that looks exactly like her. I can't use it for advertising, but I can post it publicly. So what, e
> Yes. You do not have permission to use their likeness. Not hard to understand.

So if I sleep and have a dream where I see someone, I don't have permission to use their likeness either. Jail time.

> Yes. You do not have permission to use their likeness.

Correct, and there are laws against this already.
Should all image editors have AI built in to stop any unauthorized likeness use?
Your freedom to swing your arms ends at my nose. Which means distributing a malicious deepfake should be a crime, which is actually probably already covered under numerous other laws. Actually creating it for personal use, if nobody ever finds out? Banning that is getting creepily close to thoughtcrime laws.

All they are doing is asking them not to offer this as a service. Not even banning offering that service. You made a huge leap to it being *illegal* to even create them for "personal use". Clearly AI deepfake nudes are not the same as just drawing them or photoshopping them yourself. We didn't use to see an epidemic of children using those things to abuse classmates, or people advertising to photoshop photos for the purpose of sexual blackmail. Asking companies to not help criminals doesn't seem like anything

"So now"? It's always been that way. It's just that it was harder to do, the results often were not very good, and many times the victim never found out anyway. Now there are companies offering this as a service using AI, no skill required. Children are using it to bully each other, it is so accessible. It doesn't seem at all unreasonable to ask AI companies to try to limit this kind of abuse. Even Photoshop has restrictions on things like editing and printing images of dollar bills. This doesn't affect yo

> Even Photoshop has restrictions on things like editing and printing images of dollar bills.
I know Xerox-type machines have prevention measures for money… but this Photoshop thing must be relatively new.
There were no such boundaries I knew of when I used PS… that was before they went to the "rental" model.
But you could manipulate any images you pleased, as far as I know, prior to that. I don't believe any of the other competitive tools prevent this, like Affinity Photo or GIMP….

Banknote detection was added to the first version of Photoshop CS, which was released in 2003. Colour photocopiers had it even before that.

So now I'm abusing someone if I so much as have my own computer make a fake with their face on it? This has gone several steps too far. How to reacquire liberty?

Good question. The answers might not be simple.
A "White House"* that gets "voluntary" agreements to censor everything just might not be the answer?
> A "White House"* that gets "voluntary" agreements to censor everything just might not be the answer?
It's nothing new for them to try… trying to stop "misinformation" at Facebook and Twitter a few years ago, for instance.
We need to fight this tooth and nail… this is NOT something the govt. is there for….

There's a reason strip clubs are called gentleman's clubs.

These sorts of agreements have worked out so well for the public in recent history. I'm sure there's no way they would renege on such a binding commitment, once the government commits to not legislating.

Those commitments don't work when there's a strong motivation to break them; that's not the case here. This is more a "we commit not to do something incredibly controversial for virtually zero benefit". Even Grok isn't interested in pron, despite letting virtually everything else through [acs.org.au]. I'm not sure why Apple, Amazon, Google and Meta weren't on the list, though I'm guessing it's more to do with internal red tape than wanting to make deepfake porn.

> This is more a "we commit not to do something incredibly controversial for virtually zero benefit".

This is pointless. The underlying models don't make decisions about how they are used. You can't train a simple face swap model without someone innocently throwing it in an open source wrapper, and then some idiot on 4chan using it to copy Taylor Swift's face onto the body of a porn star. That's not how this works. AI models can't be restricted from doing things without damaging the underlying model.
At best companies can lock down their models, but there's already plenty of publicly published models out ther

What agreements could possibly work here? The models for detecting and manipulating faces are openly published and are just models; they can't make decisions about the content they are being used for. The tools required to use these models exist in the open source world. Closing the barn door doesn't help when you have no horses left.

Won't this make it easier, if not less expensive, to use real people? Does this mean that the government thinks that only real people should be used to make pornography?

What makes anybody think they'll expend too much effort to prevent what could well be the killer app for the AI industry?

Granted, I'll be the first to admit I've never tried to find an AI porn generator, but it certainly doesn't seem like it's as easy as running a Google search. I've already run headfirst into the "responsible AI guidelines" on one of the major ones, trying to make some silly cartoon artwork for AI-sung songs, so I doubt any of the mainstream AI stuff will let you make porn. Heck, they don't even let you make political cartoons of well-known celebs without complaining that you're being too naughty. By comparison, doing the ol' "head switcheroo" in Photoshop is brain-dead easy these days.

Headshot swapping in Photoshop is not trivial.
You need two images where the heads are in mostly the same position, and things like skin tones mostly match. Hair has to be right, or at least easy to remove. Even then, it takes some skill to do convincingly.

Harassment on social media is now pretty widespread. Someone posts a normal photo of themselves, and others respond with deepfakes of them. They are trivial to generate; it takes seconds and costs almost nothing. The sites are not difficult to find, and s

> Headshot swapping in Photoshop is not trivial.
Anyone with a modicum of experience can headswap in PS quite easily and do a good job of it.
Blending tones and colors is basic stuff for editing photos….
How legitimate are the services? I don't know. I uploaded a photo of myself, and about half of the services errored out. The other half stated that they had made something great and that I could enter my credit card number to see it. Needless to say, I did not pr

I'm trying to imagine how children would come to understand the human body if they didn't have their own to look at as a baseline, and only had the sanitized public education curriculum as a source.
Simple: They wait for the stork to show up with their kid after coming into direct contact with someone else, while feeling really bad about it and fearing that they will soon be in hell. Thankfully, they all died out due to not knowing how to reproduce.
And then kids/people will just download the tools off the darkweb. Pushing it further underground isn't going to put the jack back in the box.