General Mar 26, 2026 at 3:55 AM

Teens get probation after using AI to create fake n**** of classmates

Posted by RedDalmatian885


https://apnews.com/article/artificial-intelligence-deepfake-lancaster-ai-5eccb10ae81244fe475a32867f9ca2c9?utm_source=copy&utm_medium=share


149 Comments

CatholicSquareDance Mar 26, 2026 +1482
hundreds of images of over 50 underaged classmates. if this is what 2 incompetent teen boys can do alone, imagine how bad the underground market for this is.
1482
Niceromancer Mar 26, 2026 +345
It's insane, the dark web changed overnight from a place you could find anything you wanted to nothing but crypto scams and CSAM.
345
Reality-Umbulical Mar 26, 2026 +303
Thats what the dark web always was
303
bmann10 Mar 26, 2026 +230
They mean drugs. They liked buying drugs.
230
Show_Me_Your_Cubes Mar 26, 2026 +32
and p****** copies of game of thrones
32
Previous_Link1347 Mar 26, 2026 +58
Sure as hell don't need the dark web for that.
58
Environmental_Day558 Mar 26, 2026 +77
I know right, I'm trying to think of a time when the dark web wasn't a haven for illegal activity.
77
Niceromancer Mar 26, 2026 +46
I mean everything there is illegal, but its changed. It used to be you could find shit you were looking for pretty easily. Now EVERYTHING is a f****** advertisement for some kind of scam.
46
pichael288 Mar 26, 2026 +38
Last time I used it was silk road days, but even then finding things wasn't easy, there is no deep web Google. Had to already know where you want to go.
38
ih-shah-may-ehl Mar 26, 2026 +10
Makes sense. If I was acting on behalf of the government, I'd understand that stopping it is impossible but it's trivial to flood it and fill it with honeypots.
10
kaisadilla_ Mar 26, 2026 +6
The deep web was promoted by the CIA. It gave them a safer way to communicate, simply because their communications would be hidden in a sea of genuine content.
6
BrofessorLongPhD Mar 26, 2026 +4
There was an article I read about 10 years ago proposing that this is actually the best way to ensure privacy in this day and age. Encryption and secrecy are of course good practice too, but with how digital all of our lives are, someone who's persistent enough will eventually catch you in a slip and access sensitive materials. This is of course a major problem for government entities who need to protect their information.

The proposal is, instead of trying for 110% super-secret lockdown, to have known vectors among insiders where the information is flooded with a lot of trivial details. I think it was about a guy who was being spied on or something. He purposely walked around the city taking pictures of completely nonsensical things like public bathroom toilets and building windows. Among those things might actually be something useful for the interested party, but the rest are basically just breadcrumbs leading nowhere.

It's the same ancient war strategy where you have a plan of attack, say from the north, and then spread rumors that you're also attacking from the south, west, and east. Or maybe you're not actually attacking at all. The point is not to hide the war plans, but to mingle them in open sight with so many other possible options that your adversary can't effectively account for all contingencies.
4
FatBoyStew Mar 26, 2026 +4
There are plenty of legal things to buy off the darkweb. It's a different set of networking protocols; nothing about the darkweb itself is inherently illegal/bad. Now that said, you could definitely find anything you'd want if you knew where and how to search.
4
Andoverian Mar 26, 2026 +6
Has it actually changed, or have you just gotten better (or at least more wary) at identifying scams as you've gotten older and more experienced?
6
Niceromancer Mar 26, 2026 +6
The scams etc have gotten so aggressive it's a struggle to find anything anymore.
6
Dimatrix Mar 26, 2026 +12
That’s not true! It was also a great place to buy weed
12
DeepInTheSheep Mar 26, 2026 +117
I work in the field that tracks, and provides detailed info on, this sort of thing. We've seen LE referrals skyrocket in the past 3 years due to AI. Thing is, those trading it believe the images they create to be above the law - they aren't. They don't even try to hide their actions anymore - no proxies, no VPN, just straight off their residential IP. It's ridiculously easy to find out who they are and provide the evidence. An insane amount of these criminal referrals are on what most would consider kids. However, I've worked in this space for 18 years and have less than zero sympathy for CSAM traders regardless of age or intent. Punish them all and punish them hard.
117
pyrhus626 Mar 26, 2026 +30
On average criminals aren’t that smart, makes sense to me when it’s a new technology with initially little to no rules or regulations. The barrier to entry for trying to get AI to make something is lower too than navigating tor without walking straight into scams or honeypots, so again makes sense it would catch a lot of the dumber ones. Most of them probably have no idea what tor or a vpn are. They just saw news or ads about AI image makers that can make p*** sometimes with faces you give it, and immediately tried using it for kids.
30
CremCity Mar 26, 2026 +26
I’m really curious about this; You say LE referrals are skyrocketing. But are arrests? I’m really hoping you say yes. If it’s not a yes maybe don’t respond lol. I need at least one piece of uplifting news.
26
DeepInTheSheep Mar 26, 2026 +10
Yes. We have a wall of shame that is running out of room!
10
Niceromancer Mar 26, 2026 +11
That tracks. AI made it so any idiot pervert can craft their perfect AI p***. And of course these people are addictive types so not only do they mass produce this shit, they try to spread it as much as they can.
11
anengineerandacat Mar 26, 2026 +1
I mean that's generally how it's always been, just most services went online so application software is less common and movies/music/etc due to streaming have required less overall pirating. Gone are the days of trying to download some MP3 and finding out it's a movie with two girls and a cup.
1
Warcraft_Fan Mar 26, 2026 +1
Isn't possession of fake child p*** still child p***? Those idiots could be marked as pedophile which could cause issues in school for the next 5 or 6 years and cause issues with their future career path.
1
JaccoW Mar 27, 2026 +1
Setup is literally 10-30 minutes on even a decent gaming PC and then it's just a case of collecting reference material of the victim and p*** bodies to paste them onto. Each taking just a couple of seconds to make. This can be just an afternoon of work. And there are certainly people in discord channels who do it of famous people. Hell, that's probably going to be even easier because there is so much reference material (pictures of their face) online.
1
Exact_Patience_9767 Mar 26, 2026 +1019
Wow, the future of AI looks so bright. I'm filled with hope.
1019
Krewtan Mar 26, 2026 +293
Glad my electricity goes up every year for this. 
293
1877KlownsForKids Mar 26, 2026 +41
And my aquifers getting polluted 
41
54fighting Mar 26, 2026 +65
Who said it’s like if the salmon invented the grizzly? Why did we do that, mate?
65
Future-Table1860 Mar 26, 2026 +24
I feel the grizzly and salmon have a more balanced and natural relationship. The salmon are still a thing after millennia.
24
shouldco Mar 26, 2026 +5
The salmon invented the dam
5
za72 Mar 26, 2026 +21
All that computing power to research cancer, but in actuality it will just create more
21
ThePlanner Mar 26, 2026 +15
Don't worry, the administration is trying to prevent states from regulating AI.
15
kaisadilla_ Mar 26, 2026 +2
Of course the Trump administration doesn't want to negatively impact the amount of child p*** produced.
2
8livesdown Mar 26, 2026 +14
People have photoshopped fake n**** for decades. AI simply allows incompetent people to do it.
14
dog_of_society Mar 26, 2026 +81
It's like publishing the exact recipe for how to make poison. Sure, some people will do it either way, but there's no reason to make it easier.
81
thepianoman456 Mar 26, 2026 +2
I’m over here shitting on every single pro-Suno AI post I see on Facebook lol… cause F*** generative AI in the arts. It’s basically sanctioned theft, as well as robbing people of their potential creativity.
2
Weird_Personality150 Mar 26, 2026 +1
Well I just found pictures of you filled with something else! /s
1
L0rdSnow Mar 26, 2026 +338
I have a teenage daughter and this terrifies me. Even if she listens and follows all the online safety rules we have, she can be a victim in about 30 seconds if someone takes a picture of her. Hopefully the boys are getting some kind of counseling so they understand what they did and how it affects the victims.
338
BannedMyName Mar 26, 2026 +339
I f****** promise you the boys are not being counseled like that at all. Public schools are barely hanging on by a thread and the private/charter/whatever schools will protect the worst of their kind for funding.
339
menagerath Mar 26, 2026 +121
We also have a culture that doesn’t give a f*** about this kind of stuff. We tell people that they’re stupid for being caught, not that they did something wrong.
121
DreadyKruger Mar 26, 2026 +32
It's not the public schools' job. It's the parents'. Schools have a lot of responsibilities, but let's put the accountability where it belongs: parents. Just like we see parents getting put in jail for their kids shooting up a school or giving them access to guns. I have a teen son. I talk to him all the time about not being a creep, being respectful, being careful about what you send and share, etc.
32
OpheliaRainGalaxy Mar 26, 2026 +12
Public schools are the backup so we don't gotta deal with the aftermath of parents who aren't worthy of the name. I couldn't even get my ex to explain basic biology things to his own son! Poor kid was coping with morning wood, hearing jokes about that term at school, but had no clue what was happening or that those words meant what they do. And that's very much not a topic ya wanna leave for the stepmom to explain, like golly it happened because he needed the information but neither of us wanted to be there and I looked up at the sky the whole time.

Before I put a stop to it, the boys used to play this "pin down and tickle" game that involved screaming No and Stop a lot. Eventually had to point out very firmly to the older boy that he should be able to understand exactly what he's teaching his little brother is normal and that it'll end with him in jail someday without a clear understanding of why. His eyes went wide and they never played like that again, both started getting lots more respectful overall. No and Stop are words with meanings we should honor, so absolutely shouldn't be practicing ignoring them or believe it's normal for them to be ignored.

I hate to think how sideways those kids would've turned out if I hadn't married into the family at just the right time and started getting them sorted out. Their dad was happy to play video games in another room while creeps on YouTube and Roblox raised his kids.
12
[deleted] Mar 26, 2026 +9
[removed]
9
nathanzoet91 Mar 26, 2026 +12
So even more so that they will sweep it under the rug. Less accountability.
12
PolicyWonka Mar 26, 2026 +1
These two kids who were convicted *are* from a wealthy private school?
1
every_twisted_wave Mar 26, 2026 +82
As a teen girl myself, most of the time you don't even have a choice about photos. My school made us take group photos if you were one of the school's high achievers or took part in major volunteering events, for their social media. There are probably at least twelve images of me on their social media alone. I used to be embarrassed because my hair looked a mess, but now I have this to worry about… at least I'm not concerned over my hair anymore.
82
Normal-Rope6198 Mar 26, 2026 +16
I think it's really weird that people post a ton of pictures of their children growing up online, because it's super creepy how much information is out there if you don't actively try to obfuscate your personal information. I get why they do it, but I definitely won't be, and I even took most of the pictures off my social media, down to ones showing my face.
16
absloan12 Mar 26, 2026 +6
You can have your parents speak to administration about how you do not consent to your photo being shared. Unless you signed some waiver upon registering that has a social media policy, you can tell them no. Same thing goes for workers who are being forced to use retinal or face ID clock in systems. You have a right to your identity. And you can not consent to their use of your image.
6
Normal-Rope6198 Mar 26, 2026 +14
I guess one way to approach it is that now if you legitimately have n**** leaked for whatever reason it’s really easy to just claim it’s all ai
14
1829bullshit Mar 26, 2026 +10
Same. And knowing that these little fucks are getting nothing more than a slap on the wrist for the trauma they induce is infuriating. Really makes one think about how else justice can be served.
10
askalotlol Mar 26, 2026 +13
Counseling? Sure, as long as it takes place in a juvenile detention center. They victimized dozens of girls. They created child p**********. At 14, they were very much old enough to understand the gravity of what they were doing. The crime is heinous, they should be incarcerated.
13
2cats2hats Mar 26, 2026 +1
> Hopefully the boys are getting some kind of counseling so they understand what they did and how it effects the victims.

Agree. Curious what the parents' take is? Some are angry, some are in denial their precious child would do such a thing... my guesses.
1
PPMD_IS_BACK Mar 26, 2026 +1
Even if those disgusting boys did, there will be more that take their place at being disgusting. It will never end.
1
Allobroge- Mar 26, 2026 +1
The worst part is I have zero clue of any potential solution to this. Every attempt at banning any internet based application has failed before. 
1
RepresentativeCod757 Mar 26, 2026 +31
How's AI doing on the cure for cancer?
31
WolfWraithPress Mar 26, 2026 +17
Sorry, all I can do is develop a new type of cancer that charges you rent.
17
recyclopath_ Mar 26, 2026 +4
They basically stopped working on any of that because they are throwing so much money at gen AI that there's nobody with any AI expertise left to work on actual good.
4
cribsaw Mar 26, 2026 +3
It’s giving blastemal cells huge t*** and cumshots
3
FuzzyEmployment5397 Mar 26, 2026 +184
Harsher punishment than what Elon got for the same crime
184
cptbeard Mar 26, 2026 +35
seems to be always the thing that if you do some crime in big enough scale the law stops working. same happens with many other things too like J. Paul Getty has that famous quote "If you owe the bank $100, that's your problem. If you owe the bank $100 million, that's the bank's problem." also with lying, if some generally reputable guy gets caught for one small lie or maybe just inaccuracy/misunderstanding it might have a huge impact on their reputation, but then if some massive a-hole says nothing but lies people just tolerate it and work around it (case in point DJT). there's probably some named principle that covers these under one umbrella.
35
bmann10 Mar 26, 2026 +5
The law does indeed work it’s just that prosecutors being elected officials is an insane and stupid practice and so the law is never enforced at that level. They run on being “tough on crime” which is code for “I promise to put the most brown people in jail possible” and part of that is just numbers. Why prosecute one rich guy you will probably lose against for years, when you can prosecute like 3000 poor people with that same staff and money, and have good numbers at the end of the year. Your constituents aren’t going to vote you out for not going after the rich guy, they again only care about how many brown people you lock up in a majority of the US. It is in part due to this that the DOJ exists but currently it is run by people who actively protect the worst members of the ruling class so not much is gonna happen there.
5
OpheliaRainGalaxy Mar 26, 2026 +1
I recently got to watch two very moral folks, an old lady and a young man, discuss our justice system. Both have terrible things on their records that they absolutely did not do. But they were given a choice between signing a paper saying they did it and getting to go home, or just rotting in jail for god knows how long.

The old lady was my auntie. When the "justice system" stuffed her in prison, they took her away from her baby. The kid got passed around the family like a Hot Potato, grew up to be an abusive drunk whose kids mostly hate him. So that's at least three generations of damage. My family apparently "earned" that because a young mom grew a plant and sold it. The young fella had a similar story, but on the opposite side of the country and at least 40 years later.

So as far as I can tell, the system for the poor isn't that far off from what they did to that queen on Game of Thrones. "Confess or we'll continue to hold you in terrible conditions. If you confess, we'll let you go, though only after parading you in the streets and announcing your crimes to the entire community and totally ruining your reputation." Except it seems to usually be innocent people, or at least mostly so. Both folks were very straightforward about which parts of their records they're actually guilty of, my auntie really did sell weed and that fella really did break windows after getting roofied at a bar.
1
permalink_save Mar 26, 2026 +45
So glad AI is unregulated and can generate CP, meanwhile anything I do that touches technology wants me to upload a driver's license. If you haven't heard, they're trying to make it so that using a device at all, at the OS level, requires verification. But AI can do anything it wants, got it.
45
Careless-Gain6623 Mar 26, 2026 +23
The people in power are pedophiles and you are not. Why would they make laws that infringe on their lifestyles?
23
czs5056 Mar 26, 2026 +67
Seems a bit lenient for making child p***.
67
tazztsim Mar 26, 2026 +17
A lot of it too
17
JaccoW Mar 27, 2026 +1
The problem with these tools is how easy it is to do once it is set up. Take a single picture of a person's face. Realistically paste it onto another body in literal seconds. A couple of minutes if it is videos. You can have an extensive collection in a few hours.
1
PolicyWonka Mar 26, 2026 +12
Charges and sentencing are usually lighter when the defendant is also a child. Some studies suggest that ~20% of children 15 and older have created or shared CSAM. Of course, when you're a kid, you're probably not thinking that "sending n****" is a crime because of your age. This is something that just about every school district has had to deal with at some point or another.
12
chevybow Mar 26, 2026 +6
I feel like there’s a difference between sharing n**** of yourself (to another person your own age, presumably a partner) and generating AI deepfake p*** of your classmates without their consent..
6
fullmoon63 Mar 26, 2026 +44
This is exactly the kind of thing people were worried about when AI image tools blew up.
44
CRAkraken Mar 26, 2026 +61
The CEOs of these AI companies need to be charged for facilitating the creation of CSAM or this will never end. Until the bubble pops.
61
aradraugfea Mar 26, 2026 +24
This is what gets me. You've got websites left and right putting up crazy barriers for a child to even use the site, large social media banning even KIND of "adult" art (as in a drawing) from their platforms, even p*** sites refusing to host anything where the subject's age (photographic or illustrated) is in question. All to avoid the legal/financial crackback that would come otherwise.

But you can point Grok at a photo someone posted of a literal, actual, 10 year old child and say "put her in a bikini." You can feed one of these programs a middle school yearbook photo and get it to spit out child p**********. And nothing happens. We just roll with it. It can't be the AI's fault because it doesn't have agency. It's not the company's fault because they didn't make it, they just provided the tools and hosted the image on their servers. But if I upload something ELSE illegal to these same websites, the website's legally liable?

I'd class it as the same thing we saw with electronic cigarettes, where the laws were so specifically about the actual tobacco that there was a legal loophole that had to be closed. I'm pretty certain photo-realistic n**** of a middle schooler are gonna get someone in trouble even if the girl's got 6 fingers and a hairline that blends with her ear. US law does make a distinction between artistic depictions and the real thing, but that none of the sites generating this stuff on command are in any way culpable, the same way the law would hold YouTube culpable if they let me upload the MCU, is one of those "make it make sense" moments.

Other than "they've got enough money that they're single-handedly providing the illusion our economy is growing," tell me why these companies aren't being forced to either change their models so they stop making child p*** (even if the prompter asks pretty please) or being taken down until they can change the models to stop bypassing their own content filters when the prompter says the magic word.
24
fauxedo Mar 26, 2026 +6
There's nothing stopping the people running these AI engines from using their own tools to make sure their users aren't breaking the law. If AI can be used to generate these images, it can be used to screen them before giving them to the user.
6
Blockhead47 Mar 26, 2026 +2
There is more to read than what I pasted below. See link: https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-p**********

> Federal law prohibits the production, distribution, reception, and possession of an image of child p********** using or affecting any means or facility of interstate or foreign commerce (See 18 U.S.C. § 2251; 18 U.S.C. § 2252; 18 U.S.C. § 2252A). Specifically, Section 2251 makes it illegal to persuade, induce, entice, or coerce a minor to engage in sexually explicit conduct for purposes of producing visual depictions of that conduct. Any individual who attempts or conspires to commit a child p********** offense is also subject to prosecution under federal law.

I wonder if the below would legally apply as an interstate crime, since the AI servers are likely not all within one state:

> **Federal jurisdiction is implicated if the child p********** offense occurred in interstate or foreign commerce.** This includes, for example, using the U.S. Mails or common carriers to transport child p********** across state or international borders. Additionally, federal jurisdiction almost always applies when the Internet is used to commit a child p********** violation. Even if the child p********** image itself did not travel across state or international borders, federal law may be implicated if the materials, such as the computer used to download the image or the CD-ROM used to store the image, originated or previously traveled in interstate or foreign commerce.
2
EightGlow Mar 26, 2026 +5
Thank god we’re getting rid of the Colorado river for this
5
RightofUp Mar 26, 2026 +170
Interesting that 14-year-old boys creating AI-generated n*** images of their female classmates would be called pedophiles….
170
AhBee1 Mar 26, 2026 +11
Kids creating child p***.
11
RightofUp Mar 26, 2026 +29
Yeah…. By definition, anything they do would be underage. Doesn’t make them pedophiles.
29
colinisthereason Mar 26, 2026 +5
Not pedophiles per se, but absolutely sex offenders. This definitely isn't over, either. The families of these girls are absolutely going to sue the shit out of these boys and their parents
5
RightofUp Mar 26, 2026 +4
For what? Were they even disseminated? The article was incredibly lacking on a lot of pertinent details.
4
bosceltics23 Mar 27, 2026 +2
Distribution of CSAM? Hell it can be real pics from two minors who are dating and decided to send each other consensually their own n**** and most states consider it distributing CSAM regardless of intent/platform. Even if they have it saved on their own phones - that is CSAM.
2
NACITM Mar 26, 2026 +6
at the very least pedophile enablers.
6
Th3Batman86 Mar 26, 2026 +6
It was such a big deal when they took down Napster, and Backpage, and Pirate Bay. But making and distributing child p*** in seconds now with AI? Slap on the wrist, if it's treated as a problem at all.
6
neondirt Mar 26, 2026 +2
Makes it pretty obvious that the only important thing is money (for the powers that be).
2
boopboopadoopity Mar 26, 2026 +13
> The defendants declined several opportunities to comment to the judge, who said he had not heard either boy take responsibility or apologize.

This is disturbing?? Only one of the two lawyers has claimed their client is sorry as well. Part of me hopes there is a legal reason they're advising them not to apologize, because the alternative is so upsetting.
13
PolicyWonka Mar 26, 2026 +10
In the United States, an apology can land you in legal hot water. It can be construed as an admission of guilt. In this context, probably based on the advice of their lawyers due to potential civil lawsuits.
10
askalotlol Mar 26, 2026 +16
Probation is a slap in the face to the 50 girls they victimized by creating child p********** of them. This will haunt them for years. They should be in a juvenile detention center.

> If they don’t have any additional legal problems, Brown said, the case can be expunged after two years.

Disgusting.
16
TheGriffin Mar 26, 2026 +3
The girls who had images made of them should be the ones deciding what is an appropriate punishment
3
AhBee1 Mar 26, 2026 +11
Isn't this what Elon designed Grok to do?
11
King_James_77 Mar 26, 2026 +59
If the ai was so smart, it wouldn’t create n**** of unconsenting people. It would raise ethical questions and refuse. But hey, this is the future of ai I guess. Unethical perverted content. How f****** sad. Couldn’t be used to help save lives or sum.
59
Visual_Collapse Mar 26, 2026 +26
Not how "AI" works =/ It's just a big pile of linear algebra "trained" on lots of data. It doesn't have ethics. It doesn't even have a concept of ethics. What you need is "AI" that never "learned" how naked people look.
26
BloatedGlobe Mar 26, 2026 +3
To add to that, it needs to be trained on a lot of images harvested from the web, too many to be checked by a person. So the training data inevitably contains CP, which means that it can be generated. You can give GenAI guidelines to try and prevent them from producing similar images, but if it’s in the training data, users can get around these guidelines.
3
Environmental_Day558 Mar 26, 2026 +52
AI isn't a sentient being with morality, it's only good for what you train it for and it is used to save lives. GenAI was used in the medical field for imaging before chatgpt became public. The thing is the general public doesn't have use for using models that can enhance radiological imagery and diagnose illnesses, but they do have a use for it when it comes to making memes and p***. So this is the side of AI we are going to see much more often. 
52
TimothyMimeslayer Mar 26, 2026 +5
It could be used to see what clothing would look like on you before you order it online. That is a good use case.
5
WolfWraithPress Mar 26, 2026 +4
"A.I." literally does not possess intelligence. You have been tricked by the naming convention pushed by corporate interests.
4
elbenji Mar 26, 2026 +7
AI doesn't think. It's strings
7
ReklisAbandon Mar 26, 2026 +1
At least until we get a functioning government again that might pass some laws regulating AI.
1
PolicyWonka Mar 26, 2026 +1
There are specific models that exist to only create pornographic content. There are entire companies built around those models nowadays.
1
Evakuate493 Mar 26, 2026 +7
These punishments need to get severely worse. Teen or adult, this shit is nothing but toxic.
7
flamedarkfire Mar 26, 2026 +9
Don’t want to ruin their futures as lackeys of the feudal techbros
9
Kramerica5A Mar 26, 2026 +3
This happened about 10 miles from me, and it's not even the incident they're talking about in the article. It's going to get bad, really quickly. 
3
mobbdeap Mar 27, 2026 +3
The future is so bright, I have to wear shades.
3
vincec36 Mar 27, 2026 +3
They ain’t gonna do anything to help women or children. But they will pass laws to surveil you with the excuse “ to protect the children”. We all saw how the E files were covered up, they don’t care about kids
3
BojackWorseman13 Mar 26, 2026 +10
Why the f*** are judges so lenient on little cretins like this and Brock Allen Turner. It’s cold but this should have ruined their lives and futures.
10
penguished Mar 26, 2026 +4
I'm just reminded of stories where the weirdo teens like this get their slap on the wrist and go on to do much more serious crimes as adults. 60 hours of community service does what exactly to change their minds?
4
MrFizzbin7 Mar 26, 2026 +4
Here is my question: why aren't the tech companies liable for disseminating child p**********?
4
Fanfics Mar 26, 2026 +49
There are some interesting questions here about where you draw the line between deepfake revenge p*** and high schoolers doodling in their notebook. Like how realistic does it have to be? If you're too good at drawing, does it become a crime? Does it have to be AI? Is making adult images in Photoshop illegal? What about physical collage? As gen alpha grows up with image generators on every phone, these questions are going to stop being hypothetical nitpicks and start being real boundaries we have to draw. I dunno about any of those questions.

This article is weirdly sloppy for the AP, light on details about the actual crime committed here - it doesn't even mention what the specific charges are. Is this because the people involved were minors? It goes out of its way to say that the defendants were accused of being "pedophiles," which, yeah, I *hope* these 14-year-olds are mostly interested in high school girls.

Is it a distribution problem? Were these photos posted to social media? What about the sites hosting them, are they going to be pursued? The article isn't clear; the only mention is of a defendant saying they were never meant to be shared, and a law that passed recently mandating sites take them down.

At first I thought all this vagueness was because the proceedings were sealed, but "Juvenile proceedings in Pennsylvania are normally closed, but this was opened by the judge, providing an unusual opportunity for the community to be seen and heard." So why is this article full of emotional accounts and almost no information on what actually happened?
49
Tibbaryllis2 Mar 26, 2026 +23
Add to your list of questions the confusing and appallingly contradictory nature of regulations.

Just recently (today?) the US Supreme Court ruled ISPs aren't liable for any illegal downloads coming through their systems *even if they can identify that traffic as likely being of that nature*. Then you have about half the states requiring uploading actual ID to view adult websites. But anyone can use these AI generators to make CSAM. Then add in the massive amounts of CSAM in all but name in things like anime/manga fandoms.

What these kids did was clearly wrong and they should have known it was wrong, but I could also see where someone growing up with this technology might have a bit of an issue finding the line.

Edit: also add, from today, the social media companies like Meta being found negligent in the social media addiction trial.
23
Niceromancer Mar 26, 2026 +54
This isn't in any way "doodling in your note book" and even trying to compare the two is bad faith at best. This f***** created hundreds of images of his underaged classmates without their consent using AI.
54
Forgettheredrabbit Mar 26, 2026 +37
Ok but they weren’t really equating the two. Their point was that we’re going to have to set boundaries and rules for things we’ve never really had to restrict before, because AI is moving the goalposts on what’s possible.
37
Niceromancer Mar 26, 2026 +18
I'm going to just kinda suggest that mass-manufacturing p*** of your underaged classmates (he made over 200 images) should be a hard restriction. AI should never even be allowed to generate n*** images of anyone, ever; if the prompt involves making a person s***/naked/whatever, the AI should respond with a hard no. If you want p*** of someone and they consent to it, get a god damn camera. There should be punishment for all involved: any AI company allowing this should be fined harshly, and anyone found doing this should also suffer consequences. It is absolutely absurd that this is such a struggle to get under control, all because some rich fucks think they can make more money off of sick perverts.
18
KhonMan Mar 26, 2026 +3
You’re still not addressing the point. AI doesn’t do anything that photoshop couldn’t already do, it just makes it easier. So sure, have guardrails on AI to make it harder or impossible (if you can) to do this. But fundamentally it would be logical to treat anyone using photoshop like this the same way. In my opinion you won’t be able to successfully stop creation with either laws or technology, so the laws should be around banning distribution.
3
pyrhus626 Mar 26, 2026 +3
I’d put the legal line at distribution, in which case it falls under normal CSAM rules. I imagine a teenager talented at painting who put up passably realistic hand-drawn n**** of girls would also be punishable the same way.
3
Jscapistm Mar 26, 2026 +8
It would not. Human-made artistic depictions are protected. The teens could be punished by the school for other things, but it is not covered by CSAM law because it is not an actual image of an underaged individual but an imaginative depiction of one, and a drawing, no matter how good, is clearly not the real image. Even photorealistic drawings are, when you look closely, different, because of the nature of how the art is created on the medium. Where you would have a more interesting case is if a digital artist used digital tools to hand-paint a photorealistic depiction. In that case it isn't obviously unreal, but it is still imaginative and ultimately a human artistic creation, so it might come down to whether the person possessing the image could show that they made it by hand, so to speak. Now, it not being covered by CSAM laws doesn't mean it isn't covered by defamation laws. But, probably because we'd have to throw out half of classical art, drawings, paintings, and sculptures aren't ever illegal on their own.
8
Phronias Mar 26, 2026 +14
They are aspiring to be president
14
bakeacake45 Mar 26, 2026 +5
Grave mistake and extreme prejudice against the victims by this so-called judge. Those boys and their parents should be in jail. The judge basically told the victims they should have laid back and enjoyed it.
5
WolfWraithPress Mar 26, 2026 +26
An incel technology, invented for incels, to enrich incels.
26
FThePack Mar 26, 2026 +14
Lots of AI shills and Pedophiles patrolling this post. Gross.
14
aftocheiria Mar 26, 2026 +10
Actually insane to me that so many people are defending this. Is this a psyop? Wtf is going on!
10
Creepy_Arm_1174 Mar 27, 2026 +2
Not good, but I’m also not at all surprised. This isn’t even the tip of the iceberg of the effects AI will have on society. I’m just waiting for the day that something even worse happens and AI gets reined in legally.
2
The_Sum Mar 26, 2026 +12
Probation...? Really? "The defendants declined several opportunities to comment to the judge, who said he had not heard either boy take responsibility or apologize." and "Brown ordered each to perform 60 hours of community service, have no contact with the victims and pay an unspecified amount of restitution. If they don’t have any additional legal problems, Brown said, the case can be expunged after two years. As he imposed his sentence, Brown said that if they were adults, they probably would be headed for state prison. He said they should “take this opportunity to really examine” themselves." What a pathetic sentence. Their probation needed to be a complete banishment from technology until 18 followed by 80-120 hours of community service every summer break until 18. This sentencing was way too light and simply teaches boys to be better at concealing their activities.
12
Careless-Gain6623 Mar 26, 2026 +3
Should be a day community service for each image.
3
FuzzyJellifish Mar 26, 2026 +16
This is the answer. These “boys” knew what they were doing was very wrong, they expressed no remorse, and they ruined the lives of these girls. But f*** the victims cuz they’re just “pictures” and they’re just “teenage girls,” right? People in these comments seem to think a single sentence of probation is enough and “they’re just 14, how could their poor underdeveloped frontal lobes KNOW it was wrong??” Except the internet is forever, those pictures are forever, and they WILL reoffend. They shouldn’t be let near a computer until they’re 25 and their brains ARE developed if this is how they act at 14. This sentence is a slap on the wrist by a fellow male who just used “boys will be boys” as an excuse. One girl needed trauma therapy, many have expressed panic and anxiety attacks, several are terrified the pictures will pop back up when they’re trying to get jobs, many girls had to transfer districts. But they’re female so f*** them, they’ll get over it, right? We wouldn’t want to ruin the lives of these 14 year old teens over some silly pictures! Also, you incels and fellow 14 year olds can downvote people all you want. I hope one day you have daughters and it’s their face on some graphic p*** plastered all over their high school. Idiots.
16
InternetName4 Mar 26, 2026 +11
99% agree. They need real consequences and this ain't it. But I don't think it's fair to wish harm on the weirdo defenders' hypothetical daughters; best to hope they don't have any. I think it's pretty optimistic to think they wouldn't say the same thing about their own child. On the other hand, I do wish there were a way to make most men understand the horror of sexual violence and exploitation, so I get why you said it.
11
Wanna_make_cash Mar 26, 2026 +7
A complete banishment from technology isn't viable in modern times. There was a court case in my state 10-15 years ago where a court sentenced a man convicted of CSAM offenses to literally never use a computer again for the rest of his life. The higher courts struck it down as a punishment that couldn't be given, because even back then computers were too integral to daily life, and it was too vague a punishment given how increasingly common computers were becoming. They kicked the case back down to the lower courts to resentence the offender, and those courts had to settle for making him install monitoring software on any devices he had and restricting his computer use to education- and employment-related purposes. If that was the thinking 10-15 years ago, I can only imagine similar things would be said now.
7
AstroBullivant Mar 26, 2026 +3
This problem is only going to get worse.
3
Infamous-Sky-1874 Mar 26, 2026 +13
> The boys were 14 at the time. They admitted this month that they made about 350 images, showing at least 59 girls under 18, along with other victims who so far have not been identified. And someone decided that probation was the appropriate sentence? Edit: Oh look the pervert defender squad has rolled in with the downvotes.
13
NKD_WA Mar 26, 2026 +99
I don't know what the appropriate sentence is here, but non-violent crime committed by 14 year olds with clean records? Probation and community service is pretty standard and not at all surprising. Should we punish kids more harshly? Maybe. Not really an expert.
99
CatholicSquareDance Mar 26, 2026 +26
i honestly don't see enough details in the article to say if their punishment was fully appropriate. this was an awful thing to do, they dramatically impacted the lives of dozens of young girls. but like, as 14 year olds, could they even really comprehend how awful this was? i don't even know. their empathy was probably not developed enough to see through the layers of abstraction to appropriately understand the harm of their actions. but the harm was also extremely real, and it seems that they didn't express much remorse about it. maybe they deserve worse? i don't know.
26
FuzzyJellifish Mar 26, 2026 -4
Non violent to whom? These girls are in therapy and are having panic and anxiety attacks, losing sleep, and having to transfer districts. It’s always “non violent” when it’s coming out of the mouth of a man. Sexual crimes ARE violent and they include plastering p*** pictures of your classmates all over the school. Why is this even a debate? You all literally sound like perverts protecting perverts.
-4
BigMeatPeteLFGM Mar 26, 2026 +10
Violence doesn't mean harmed. It means physically harmed using force. The girls were emotionally harmed.
10
TelevisionExpress616 Mar 26, 2026 +14
People don’t give enough agency to 14-year-old boys. There’s enough free p*** out there to watch non-stop for 100 lifetimes. Literally. Not to mention image generation of things that don’t involve people you know. This isn’t some stupid shit like getting high. This isn’t a victimless crime. Going out of your way to generate images of your classmates is vile. It shows a complete lack of humanity and a complete disregard for those girls as human beings with their own feelings and bodily autonomy. F*** probation. I knew kids who got probation and community service just for underage drinking. Let them get juvenile detention with a record expungement at 18; they can get their GED after. And lock up the image-generator AI execs too.
14
TeamHitmarks Mar 26, 2026 +13
What do YOU think is appropriate?
13
Careless-Gain6623 Mar 26, 2026 +2
Juvie and mandatory community service. Names on sex offender list.
2
[deleted] Mar 26, 2026 +14
[removed]
14
mfmeitbual Mar 26, 2026 +20
They're teenagers. If you were to pick a single phrase that encompasses the things teenagers do, "dumb shit" would be first on most people's lists. I'm not defending perverts, I'm saying they're dumb f****** kids, and yeah, probation / community service absolutely was the appropriate sentence. I'm entirely unclear how more severe penalties - incarceration would be the next step up, right? - would help anything.
20
EpicRedditor34 Mar 26, 2026 +15
You guys give far too much grace to teenage boys. These girls will be traumatized for life.
15
FuzzyJellifish Mar 26, 2026 +10
They shouldn’t be allowed anywhere near a computer until they’re out of their “doing dumb shit” phase, for starters
10
janosslyntsjowls Mar 26, 2026 +9
How about paying for the victims' therapy out of pocket? Boys will be boys, consequences are just for victims
9
FeastForCows Mar 26, 2026 +10
"Oh look people didn't immediately agree with me so I gotta edit my post to acknowledge it".
10
MiaowaraShiro Mar 26, 2026 +1
> Edit: Oh look the pervert defender squad has rolled in with the downvotes. Nobody's defending perverts, and that you have to strawman literally everyone responding to you with this kinda shows you're unserious and care more about satisfying your own desire for revenge than about actual reformative justice. Stop letting your emotions rule you.
1
Blighton Mar 26, 2026 +4
part of the probation is instructing members of Congress on how to do it
4
Sobeman Mar 26, 2026 +5
What grok was made for
5
SuperPotatoThrow Mar 26, 2026 +7
They should have gotten more than a slap on the wrist. Doesnt mean prison for life either, but enough to discourage this kind of thing from happening in the future.
7
cribsaw Mar 26, 2026 +1
How were these fakes discovered? Were they being shared, or used to harass people? Look, I’m not going to defend people making this shit, especially when there are literal children involved. That’s inexcusable, and you are making CSAM as far as I’m concerned. But if you’re going to be a pervert with images of adults, do that shit in private and keep it private. Why you would want to be responsible for destroying someone’s peace of mind like this is beyond me.
1
chefspork_ Mar 26, 2026 -7
Boys will be boys must be a strong defense.
-7
Zardotab Mar 26, 2026 +10
To be honest, at 14 I was both horny and stupid enough to try such a prank if the tech existed back then. Yes, I would deserve punishment, I won't deny. (We had to make cuneiform p*** back in my day, snuck it behind the oxen shack. I'm sure the Great Anu shall punish me, might explain the cataracts.)
10
Ninja-Ginge Mar 26, 2026 +6
Would you have spread those pictures around?
6
leohat Mar 26, 2026 +4
Locker room talk
4
← Back to Board