Discussion
The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought
colpabar: Usually threads like this fill up with comments about how "think of the children" is always a lie used to justify something draconian. I agree with that to an extent, but for those who think that applies here, is there _nothing_ we can do as a society to address this? I'd like a better answer than "let parents deal with it," and the whole "this wouldn't be a problem if America weren't so puritanical maaaaaan" argument is total bullshit that completely ignores the young girls who get hurt by this.
b00ty4breakfast: I think in this instance the issue is that these tools shouldn't be available at all to anyone, adult or child. There are not many use cases for generating porn from images of real people who aren't porn stars, and most of them are not good.
slashdev: How much violence (government control) do you need to exert to block all of these tools? How successful have we been at blocking pirated content? I don't think the genie is easily returned to the bottle, and the cure may be worse than the disease.
phyzix5761: I used to think the same about pirated content until I traveled to a country that has been successful in blocking it. It's impossible to even find a free movie on YouTube there. The government really cracked down on it and succeeded.
jacknews: I don't get it. Everyone knows that a nude is going to be AI. Even actual nudes can be explained away as just AI. Sure, it's distasteful, but it's no different from cutting and pasting heads onto porn stars, as was done plenty before. I'm not saying there's entirely no harm in that, it's obviously a form of bullying, but AI does not make it novel, or a crisis.
housebear: You don't "get it" because you are not a person who would ever be at risk of suffering the embarrassment, shame, and violation of something like this happening to you. These are not "porn stars" being talked about. These are children. In classes with the same children perpetrating these acts. Do you remember being a child? Do you remember the nuances and challenges of social dynamics from those years? Can you imagine (empathize with) being a young girl and having fake nudes of yourself sent around to others? I don't imagine you can. "Distasteful" doesn't begin to cover it.
jacknews: I have kids in exactly that demographic. It is bullying. But AI does not make it more than that, or a crisis.
ceejayoz: That's like saying machine guns didn't change warfare because we had guns before that.
phainopepla2: You really don't get how distressing it would be to a teenage girl to have extremely realistic nudes of her spread around publicly, even if everyone knows they're AI generated? Did you try imagining yourself in her place, imagining the social world she inhabits?
UltraSane: It really is only as distressing as you let it be.
post-it: Yeah, we've pretty successfully gotten rid of online child porn from, like, front-page search results and public social media; there's no reason we couldn't do the same for this.
ryandrake: The article isn't about content on the front page of search results. The content in question is being circulated privately through chats. The proposed solutions, so far, have been the broad, nasty, invasive content scanning that nobody on HN wants (for good reason).
BikDk: Ah, yes - the moral ambiguity of sexualized western culture - somewhere between medieval moral repression and (tech-driven) liberalization.
gicadin: archive - https://web.archive.org/web/20260415152353/https://www.wired...
SV_BubbleTime: Heads up, for me this wasn’t a working paywall bypass.
Mezzie: There are broadly speaking two options, neither of which will ever be implemented. What will happen is young girls will learn to accept that deepfakes are made of them and that it's part of life, in the same way explicit sexual commentary starting at the age of ~9 is, or how men follow you when you go somewhere in public, etc. They'll accept nobody cares, in the same way nobody cares if you're sexually harassed in other ways.
The two options are either the people in power standing up for the girls or giving the girls the power to deal with it themselves.
People who have power generally are benefiting from the structure in place, and so don't want to change it and/or don't want to do any more work. Expelling all the kids who create deepfakes would cause a lot of arguing from parents and people who are on the boys' side, and they just don't want to deal with that. It's easier to tell the girls to be quiet.
The other option is setting up a system that rebalances the power. For example, if a kid gets caught making deepfakes, give their victim access to every single thing on their devices: private messages, Discord chats, images, etc., and let the victim decide who to release that private information to, and what. Not going to happen.
Another reason nothing is going to be done is that if we teach 11-year-old girls it's not acceptable for people to do this to them, they'll carry that forward through their lives, and a lot of people who find it gross to create fake porn of children are fine with doing it to older women, and they don't want to create women who make a fuss about it. There are a lot of people who think it's disgusting that I was sexually propositioned when I was 10/11 but think it's fine that I can't go for a walk in my neighborhood now without being bothered, since I'm older.
alephnerd: Using Photoshop to paste a face onto a nude body involves significant friction that a non-guardrailed foundation model eliminates. Most people are not technical in nature and cannot tell the difference between deepfakes and real photos and videos in short bursts. Basically, the friction needed to produce revenge porn has been dramatically reduced.
brazzy: > Most people are not technical in nature and cannot tell the difference between deepfakes and real photos and videos
They don't need to. The point is that eventually, everyone will just assume it's a fake, exactly because you can't tell, and fakes are easier to produce and thus more common than real leaked nude photos and videos. At that point, a deepfake shouldn't be more socially damaging than a rumor.
alephnerd: > Usually threads like this fill up with comments about how "think of the children" is always a lie used to justify something draconian
It also highlights HN's demographics. What younger women feel is problematic is viewed as trifling by a number of younger or middle-aged men on HN (especially those without kids).
lo_zamoyski: Only to certain kinds of defective men/men suffering from ressentiment. No healthy man of good will wishes to see women get hurt or disrespected.
SV_BubbleTime: So I can skip ahead here…
1. This has always been possible, but the bar has been lowered to barely above typing "her, but nude!", as opposed to a talented Photoshop or pencil artist doing this previously. Is the issue scale?
2. Lots of things have been illegal and immoral without tools. Assault, for example. We don't really have to deal with those things on a large basis. The difference here is effectively thought crime until distribution takes place, and then it's just another form of assault, right? We already have laws on bullying and assault, no?
Like I said… skipping ahead, let me guess: For The Children; we must block local models, AI except for The NYSE Chosen Ones, encryption, and must have digital ID to use everything. Tell me this article is in favor of anything else.
As soon as you figure out that everything you read at large outlets, especially those owned by Condé Nast, is directly written to affect a stock price somewhere, it becomes a little exhausting.
post-it: > Like I said… skipping ahead, let me guess, For The Children; we must block local models, AI except for The NYSE Chosen Ones, encryption, and must have digital ID to use everything. Tell me this article is in favor of anything else.
No, delisting online nudify apps will take care of 99% of this. There's no reason for them to exist.
ceejayoz: Thank goodness middle schoolers have fully developed adult emotional regulation, then.
UltraSane: Well then they should be TAUGHT this. Because the technology isn't going away.
orwin: That should count as distribution of CP and be punished like it is, each time. I guarantee you that the first time a boy is faced with 15 years, the media will run hard with the story and it will calm everyone down.
alephnerd: > should count as distribution of CP and be punished like it is
It should be treated as CP and revenge porn, but the issue is that legislation surrounding these in a number of cases doesn't treat digitally altered images as within the scope of CP or revenge porn.
Additionally, platforms like Grok are taking advantage of this oversight by arguing that they do not need to add guardrails.
Adding basic guardrails, like not generating a sexualized image without identity verification or preemptively blocking questionable prompts, would dramatically reduce this problem.
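[To make the "preemptively blocking questionable prompts" idea concrete, here is a minimal sketch of that kind of pre-generation guardrail. It is not Grok's or any vendor's actual moderation pipeline; the pattern lists and the screen_prompt helper are illustrative assumptions, and a real system would layer trained classifiers, identity verification, and human review on top of a check like this.]

    import re

    # Illustrative (assumed) signals that a prompt asks to sexualize someone.
    SEXUALIZATION_PATTERNS = [
        r"\bnude\b", r"\bnaked\b", r"\bundress(ed|ing)?\b", r"\bnudify\b",
    ]
    # Illustrative (assumed) signals that the subject is a real, identifiable
    # person or a minor rather than a fictional character.
    REAL_PERSON_PATTERNS = [
        r"\bmy (classmate|teacher|neighbor|ex)\b",
        r"\bthis (girl|woman|boy|man|person)\b",
        r"\bphoto of\b", r"\bschool\b", r"\b1[0-7][ -]?year[ -]?old\b",
    ]

    def screen_prompt(prompt: str) -> tuple[bool, str]:
        """Decide before any image generation happens; return (allowed, reason)."""
        text = prompt.lower()
        sexual = any(re.search(p, text) for p in SEXUALIZATION_PATTERNS)
        real_person = any(re.search(p, text) for p in REAL_PERSON_PATTERNS)
        if sexual and real_person:
            return False, "refused: sexualized depiction of a real or identifiable person"
        if sexual:
            return False, "held: sexual content requires identity/consent verification first"
        return True, "ok"

    if __name__ == "__main__":
        for p in ["a watercolor fox in a misty forest",
                  "undress this girl from my school photo"]:
            print(screen_prompt(p), "<-", p)

[The point of the sketch is only the ordering: the check runs before the prompt ever reaches a model, so a questionable request is refused up front rather than generated and filtered afterward.]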
avaer: As someone on the spectrum, I really don't understand where society draws this line. It seems self-contradictory to profiteer off of sexualized mass and social media (totally OK) but then get up in arms when that power is given to the masses (deepfake nudes). It seems more about power and money than anything else, and the moral grandstanding/outrage is the manipulative icing on top.
undersuit: These children don't know they are being sexualized. Yeah, the adults in the room are exploitative and manipulative, but this should literally be a "think of the children" moment.
avaer: That's what I'm saying. Children are being sexualized and exploited left and right on places like Insta and YouTube, often even encouraged by their parents, as it can be lucrative. And this is totally okay and encouraged. But when kids start sexualizing each other, it's news and a big problem. I'm saying there is a strange standard applied where some forms of exploitation are encouraged. And I don't get it. But I get why it's confusing the kids.
embedding-shape: What country was this specifically? I have yet to hear of a country that successfully eliminated 100% of piracy, so I'm very curious to hear what country this is. Obviously, most piracy sits outside of YouTube, so I'm not sure why you're using that as an example; it's a US property, and you're clearly not talking about the US.
user34283: While that might be the case for now, as the offender would need access to a PC with an NVIDIA GPU, it will not hold for long. Once you can run decent image generation models on your phone, what shall we do? Prohibit apps that allow you to run custom models? Add a mandatory safety check? It seems like a pointless exercise to me - better to punish the actual crime than to try to regulate the tech.
UltraSane: Then any similar regions in latent spaces would also count as CP, right?
red-iron-pine: If we ever get to that point, where it is no longer possible to distinguish truth from falsehood, we're done as a society. This is not a good thing, nor should it ever be an inevitable one.
red-iron-pine: How do you teach someone whose brain is still developing? "Hey kids, get used to being exploited sexually, as it would be too expensive to require massive multinational corporations to bother to regulate AI."