News & Current Events May 15, 2026 at 9:40 AM

More than 70 million warnings sent to people searching for child sexual abuse content

Posted by Samski877




125 Comments

PutMindless6789 9 hr ago +500
I got sent one looking for the Baby Merchant song from Cop Rock.  It is really easy to search for a suss collection of words inadvertently, and trigger the system. Lol. 
500
mhornberger 9 hr ago +1
It's also that the filters get more restrictive and bizarre. People have had police called on them for pool photos of their own kids. Someone out there realized that pedos may sexually enjoy otherwise normal pics of kids at the beach, pool, etc, and decided that those pics are suspect. It can turn from a concern for children to a moral panic very quickly.
1
PM_me_Henrika 9 hr ago +1
The problem, for those people, is why those kids aren't working in their coal mines and factories. They think children and other "inferiors" need to be put in their place, not having fun in a pool or whatever. This is just their first step.
1
Hiiipower111 8 hr ago +1
Trying to fk them is their first step toward child labor? Am I misunderstanding
1
JDeegs 8 hr ago +1
You are misunderstanding. The people they're talking about are the ones reporting people's pics of their own kids
1
WesternUnusual2713 8 hr ago +1
Oh to be fair that's cos paedos will actively find and use photos of kids for roleplay, so I guess some places will be stricter on it. There was a small kerfuffle years back when a blogger got called out for constantly posting photos of her little girl in various states of undress on her blog. She found out a paedo was using them for cp roleplay, and decided that her right to post her kid online was more important than her kid's images now being in the cp network. (Not saying at all that parents shouldn't post their kids privately on personal social media, but I do strongly believe kids should not be used as content, particularly when they are years away from being able to consent to it. But that's an entire issue of its own that is coming home to roost as blogger kids grow up and get away.)
1
mhornberger 8 hr ago +1
I just don't think that's a reasonable metric. We can't ban normal pics on the grounds that someone may get off on them. Someone out there will get aroused at pics of kids in swimsuits, in gymnastics leotards, in schoolgirl outfits. We don't ban all those pics just on the pretext that somewhere a pedo exists who will get off on them. No kids can legally give consent, but to meet that standard we'd have to ban all pics of kids.
1
integer_hull 8 hr ago +1
I mean it's not a good idea, ban or otherwise. It's exposure to the largest group of strangers in existence. Personal details should be as scarce as possible; a law just might make that rule of thumb formal
1
himit 8 hr ago +1
Was in a thread discussing how entrenched this stuff seems to be and wanted to chime in that it was (disgustingly, in case that needs to be said) legal in ...Belgium? Netherlands? Until 1970-something? Went searching for the answer & got hit with a warning. I still don't know which country! (the warning was, however, at least somewhat on-topic.)
1
Clbull 9 hr ago +1
I mean "Baby Merchant" does sound like the name of a dark web pedophile forum...
1
Nolsoth 9 hr ago +1
https://www.babyfactory.co.nz/ Some companies really do need to rethink their branding sometimes. Absolutely no babies being made there and very few working there.
1
The_estimator_is_in 8 hr ago +1
Reminds me of the now defunct writing implement company, “Pen Island” and their website, penisland.com
1
lawilson0 8 hr ago +1
"It's simple, and catchy, and easy to remember."
1
Ok_Philosopher_6028 8 hr ago +1
Never forget the grapist.
1
Davido401 8 hr ago +1
Www dot therapist dot com was another, was it not?
1
Reylun 8 hr ago +1
I have a pin corkboard in my room appropriately labeled as pin island
1
throw2503 8 hr ago +1
> very few working there.

So you're saying there might be 1 or 2?
1
SecondChances002 8 hr ago +1
Some things like that can be misleading for sure. I walked down the beauty aisle at the p******* the other day. The people there were average looking at best.
1
GauntletBloggs 8 hr ago +1
I didn't get sent any kind of message, but Google accused me of searching for child sexual abuse material because I searched for "nuk perfect fit teat" rather than "nuk perfect match teat" (I had a week-old newborn at the time). I'm still unsure how my original mistake constituted a search for csam, but it was pretty unsettling and a bit offensive tbh haha
1
Feature_Ornery 8 hr ago +1
Yeah, at least I understood why Google accused me, as I was trying to find a NSFW meme from 10 years ago and wasn't thinking when I started the search with "10 year old..." Rewording it to "early 2010" got me what I wanted, but it both surprised me and made me glad that at least Google will say no.
1
PurpuraLuna 8 hr ago +1
Cyberpunk fans in shambles
1
PeeDecanter 8 hr ago +1
I got sent one for googling which cruise lines were involved in the recent CP arrests. For some reason a lot of the articles just mention Disney even though it was actually 7-8 different cruise lines, but getting that info from Google (and its AI) was like pulling teeth
1
Initial_E 9 hr ago +1
Maybe that’s the new strategy. Dilute the meaning of child sex predator, and the orange guy doesn’t seem so different from you and me anymore.
1
fadedblackleggings 8 hr ago +1
Yup slowly degrading ppls mindset
1
KamikazeFox_ 8 hr ago +1
God, my 1 yr old fell off the bed and hit her head. Surprised I didn't get one, based off my Google search that night
1
Slowmyke 8 hr ago +1
"if i suddenly need a new baby, what should i do with the old one?"
1
KamikazeFox_ 8 hr ago +1
Lolol " can I return a baby without the receipt?"
1
Tricky-Gemstone 8 hr ago +1
I got one for looking into lolita fashion.
1
ScottScanlon 9 hr ago +1
"By placing more warnings across more online spaces, we can disrupt harmful behaviour at the moment it's happening and prevent countless children from being harmed.” Unfortunately I don’t think a warning label is going to deter these sick people.
1
ZenkaiZ 9 hr ago +1
Hey it worked for cigarettes
1
Chyvalri 8 hr ago +1
You wouldn't steal a car
1
The_estimator_is_in 8 hr ago +1
I’d download one, though.
1
Teantis 8 hr ago +1
Jacking up taxes on them so young people never started smoking in the first place was way more relevant to that.
1
whodisguy 9 hr ago +630
1600 Pennsylvania Ave
630
WPCfirst 9 hr ago +140
The search is coming from inside the house.
140
indy_110 8 hr ago +1
[https://www.lucyfaithfull.org.uk/wp-content/uploads/2026/05/Project-Intercept-Impact-Report-2026-1.pdf](https://www.lucyfaithfull.org.uk/wp-content/uploads/2026/05/Project-Intercept-Impact-Report-2026-1.pdf) XAi isn't part of the group, so those using Grok are still free to pedididdy unwarned. The number would be exponentially higher if you included Grok, enough to go to the moon I'd say. Maybe Elon needs to let that Sink in while he's sitting on his Path of Exile stream? I'm not familiar with the full scope of the AI startup market; are there any other big AI companies not on the list? It's better than nothing, and it puts the "move fast, break things, no safety rails" crowd on the back foot, having to actually do PR when this eventually breaks into mainstream news.
1
SlimJD 9 hr ago +1
I wonder where the other million warnings were sent.
1
Khaldara 9 hr ago +1
The Republican National Convention
1
Future-Raisin3781 8 hr ago +1
They not like us
1
BorntoBomb 8 hr ago +1
Whitelisted from that address
1
CatsBatsandHats 9 hr ago +134
I wonder how many of these searches are from legitimate researchers etc? Or is that 70m unique users, not counting research & investigative purposes? Big numbers regardless though. Bloody hell.
134
mhornberger 9 hr ago +57
I really doubt unique users. An automated system could send out one warning per search. Just as automated systems send out email warnings for copyright violations, in some parts of the world. One person could get hundreds.
57
Samski877 9 hr ago +14
That’s almost certainly warnings, not unique users. One person repeatedly searching could generate huge numbers. You’d hope most people stop after the first warning though, which still makes the scale pretty disturbing.
14
No_Excitement_1540 9 hr ago +31
Also, we don't know what "searching for child sexual abuse content" means... Considering the usual competence of automated filters, and seeing that an old customer of mine (a pump manufacturer, including breast pumps for breast-feeding mothers) regularly finds their company website on "p***" and "sexual content" watchlists and filter lists, I'd not rule out that searching for "diapers" might be a hit... :-( Unfortunately the listed URL seems to be down, at least from Germany... ("PR_CONNECT_RESET_ERROR")
31
Tricky-Gemstone 8 hr ago +1
But be careful about this. It has been easily triggered by benign things.
1
Zeeplankton 9 hr ago +1
My gut is telling me this stat is being used for alarm. I would highly doubt it's that many individuals. It would make more sense that smaller groups are repeat "triggers" across platforms, and there's likely a large number that are simply misfires. E.g. TikTok alone has well over a billion users. 35 million alerts firing, mixed with a single user trying multiple terms, and plenty of misfires for all sorts of reasons. I don't think the actual stat really says much.
1
WesternUnusual2713 8 hr ago +1
62 million unique visitors to the r*** academy though. 70 mill feels low for this kind of content 
1
Nasu_Kaizoku 8 hr ago +1
No. The academy thing was way smaller, like thousands, iirc? The 62 million was OVERALL traffic for the p*** site that happened to have some of the academy-related videos uploaded. Not members of the academy, not views for those types of videos, just the website's overall traffic. The 62 million number really has nothing to do with the academy stuff
1
Educational-Bag-4293 8 hr ago +1
62 million was the number of visits of a p*** website, not unique visitors to a r*** academy. Please stop spreading misinformation.
1
Moist_Temperature69 8 hr ago +1
There's a Pakistani food truck near me called "Hot boyz chicken" and I got hit with a warning for trying to find their schedule on Facebook. I'm not sure if they've changed their name since but I am not gonna look it up again to find out
1
rising-buddha 9 hr ago +7
Well I would imagine it shouldn't be even 1%, or less, of that being done by researchers. This is a real issue that is finally getting the attention it deserves. 70mil is wild, I'm sure it's wayyy higher than that, it's a seriously global issue
7
Clbull 9 hr ago +1
Very, very few. These are likely global and not US-specific numbers, and you'd be very surprised at how many people are stupid and/or brazen enough to search for this c*** on the clearweb, let alone without the cover of a VPN. Not every nonce is some calculated tech-savvy mastermind hiding behind a dozen proxies and a Tor browser. I'd actually say that's a rarity.
1
redkorky 9 hr ago +53
Not sure if this was a warning, but I remember a stern message when I was using Dr. Google for my 1 year old. Upon reflection, my keywords could have been interpreted horribly.
53
EgoTripWire 8 hr ago +1
Same for me when listing symptoms to diagnose my toddler's pinworms.
1
FidgitForgotHisL-P 8 hr ago +1
There are also countless stories of people getting accounts irrevocably suspended and locked for "child exploitation images" for doing things like emailing a photo to their doctor, or showing their kid's genitals on a stream, using Gmail or Facebook Messenger. Because we're all about tele-health, and there is no way for doctors to flag this as "I am a legitimate medical professional and this is necessary for me to see". I guess they should swap to using Proton Mail or something…
1
ponydigger 8 hr ago +1
there’s a yamaha keyboard i want to buy called the “yamaha cp reface”. so i typed “cp reface” into facebook marketplace the other day, and it takes me to another page telling me off for searching child p***. i was so confused, like, what? i want a piano, not to be a creep. so i searched it again, same result. took me too long to realize the model name having “cp” in it was the reason why. pretty sure it actually stands for “classic piano”.
1
BorntoBomb 8 hr ago +1
All for preventing child exploitation, but that's not what this will be used for at all. This is going to be used as a broken-taillight charge.
1
pineconeminecone 9 hr ago +1
My town has the initials C-P and a lot of businesses use those initials, so uh, there's that
1
stripeyspacey 8 hr ago +1
We have a lot of issues with my job and our associated agencies because most of them have CP in their titles. Thing is, these agencies started out with getting services to people with *C*erebral *P*alsy, and began using "CP" as an abbreviation a long, long time ago. But now, if you search for their Facebook names, for example, you get a warning instead. It's been a fight for literal years to get their pages unflagged, we've only managed to get a few fixed in the last 3 years.
1
jwilphl 8 hr ago +1
I see it in /r/rollercoasters a lot when people are discussing Cedar Point, or maybe the Premier League with Crystal Palace.
1
Todose 8 hr ago +1
What? Where do you live?
1
jackleggjr 8 hr ago +1
Child P***, Indiana. Odd little town
1
Tu_mama_me_ama_mucho 8 hr ago +1
Corpus pristy
1
hugoise 8 hr ago +1
Cristal Palace?
1
cribsaw 8 hr ago +1
I'm skeptical of the recent conservative-backed push to "protect kids online." I don't believe the problem is as severe or pervasive as we're being told it is. We're being told that this shit is so prevalent that we should all be tripping over it every time we use the Internet, but has anyone actually unintentionally come across this content themselves, and who would admit they have?

70 million warnings, worldwide, is about what I'd expect from a world of 6 billion people using the Internet (about 1/4 of the world population does not). And that's liberally assuming the warnings went to unique users, rather than many warnings generated among a much smaller group of people. It's a lot, but are we tracking warnings for people asking about other horrible things?

We can all (hopefully) agree that CSAM and those who seek it out should be dealt with accordingly, but I think the issue is being leveraged as a means to another end. The real motive of these people is to ban p********** outright, which I will never support. They are pushing a moral agenda on the rest of us, and it's easy to move people in the direction of a total and complete ban on p********** if you can misrepresent the problem of CSAM on the Internet. If they are successful, we'll lose more than free access to p*** as their agenda develops further.
1
jwilphl 8 hr ago +1
There's good reason to be skeptical.  Republicans will cry about "protecting children," but if you analyze their actions, you realize they are all crocodile tears. Mostly they're too busy removing social safety nets that help kids and refusing to protect them from gun violence, among other things.   They don't take *action* to help children, so all the talk rings hollow and ultimately becomes pretense for something nefarious.
1
cribsaw 8 hr ago +1
The Republicans are also completely ignoring the Epstein situation. If they wanted justice for children, why have none of the men in the Epstein files been investigated and prosecuted in the U.S.? Why are the files still a big f****** mystery in the most obvious coverup in history? The same Republicans pushing ID laws to access the Internet’s most vanilla p********** are also covering up the most infamous child sex crime ring in history.
1
NicoToscani 9 hr ago +1
I know of 77 million people who voted for the best friend of an underage sex trafficker…
1
Bowsers 9 hr ago +1
Your friend should wait until they're over 18 before getting into sex trafficking.
1
Danny_Mc_71 8 hr ago +1
I got a warning when I tried to recall Homer Simpson's line about the comic "Love is". I searched for "two naked eight year olds who are married". Oops.
1
cribsaw 8 hr ago +1
Well, I mean, c’mon dude.
1
croooowTrobot 8 hr ago +1
“Eight-year-olds dude”
1
Danny_Mc_71 8 hr ago +1
Yeah... In hindsight that probably wasn't such a great idea.
1
PurpleV93 8 hr ago +1
While I do think that this is a good measure and, without a doubt, there are millions upon millions of possible predators out there, I would take this info with a grain of salt. Like some other people already mentioned in here, lots of ordinary people get these messages as well when using Google with certain blacklisted words. I got such a warning myself when trying to remember an anime name, and I searched using keywords like "warrior woman taking in child prince". I was looking for Seirei no Moribito, which I had watched years ago but forgot the name of. My reaction was "wtf did I do?". But I guess it's better to give those warnings to too many people than not enough.
1
digitaldeadstar 9 hr ago +1
Like others asked, I wonder how many were from more benign searches. Or even p*** related, but legal p***. I remember a few years ago trying to look up this one cam model by her screen name that had "baby" in it. Google gave me a stern warning. Not that there isn't a problem with this type of content being far more easily accessible these days. Or how many abusers there are out there. There's far too many.
1
AccomplishedGear7394 8 hr ago +1
I got the same warning looking for probably the same performer too. I was like never again.
1
Samski877 10 hr ago +36
The number itself is horrifying but honestly one of the scariest parts is realising how industrialised this problem has become online. People still imagine this stuff hiding in obscure corners of the internet when in reality tech companies are having to intercept searches at massive scale across mainstream platforms. The internet made it infinitely easier for predators to find content, communities and reinforcement from other offenders instead of remaining isolated.
36
dub-fresh 9 hr ago +154
This is partly ragebait. For example, I once searched "ped socks" in google (ped socks  are a type of no-show socks for wearing on your feet) and I got a message with this warning. In no way shape or form was I searching for anything remotely related to CSAM, yet I would be counted in this stat. 
154
lumynaut 9 hr ago +1
yep! any search on Facebook marketplace for Lolita clothing is flagged too.
1
Educational-Bag-4293 8 hr ago +1
One of my friends is called Lolita. Just calling her by her name on some platforms like insta makes the warning message about CSAM appear.
1
stumac85 9 hr ago +1
I have no idea if that's a brand but if it is, that's a terrible name for a clothing brand 😂
1
Jadodkn 8 hr ago +1
It’s a style. Yes, the name is deliberate.
1
wjodendor 9 hr ago +1
"gothic Lolita" is a style of clothing https://en.wikipedia.org/wiki/Gothic_Lolita
1
Tricky-Gemstone 8 hr ago +1
It's a fashion style that has been around for decades.
1
ruddsy 9 hr ago +1
Tell it to the judge bud 
1
CharlieTheK 9 hr ago +1
Why don't you have a seat over here?
1
mhornberger 9 hr ago +15
> one of the scariest parts is realising how industrialised this problem has become online. Such a thing would be harder when one had to find physical photos, probably in a shoebox at some guy's house. And without email or message boards, how would you find them? They existed, as they have since cameras existed. The problem was big even pre-web, since it was on usenet, and no doubt distributed on BBSs and similar before that. Any technology that allows people to network will let sick people contact and enable each other. Search engines have had trouble with people searching for how to murder someone, or how to get rid of bodies, as long as there have been search engines. People just don't realize that everything is tracked, recorded, and usually traceable to their IP address.
15
rangda 9 hr ago +1
It used to be NBD for people to publish books and magazines of n*** photographs of young kids, even marketed as erotica. Books like that were found in Michael Jackson’s house and famously used as evidence against him regarding those allegations. I saw some lady in a documentary years ago talking about the unease of seeing the kiddie p*** magazines next to the playboys at the newsagent back in the ‘60s or ‘70s, only the kiddie p*** wasn’t brown-bagged. The impression I get, as absolutely insane as it sounds now, is that images like that were seen as a more *sophisticated* (!!!) genre within erotica. Like fancier taste, vs a normal adult man wanting to look at adult women. Insane.
1
Samski877 9 hr ago -2
Exactly. The disturbing part isn’t that the internet invented these people, it’s that modern tech lets them find each other instantly and operate at industrial scale.
-2
YogurtClosetThinnest 9 hr ago +6
Gonna tell myself a lot of it is just people looking up "teen" p*** which is usually like 25 year olds lmao
6
Zoneator 9 hr ago +1
70 million is an insane number wtf
1
Naive_Confidence7297 8 hr ago +1
Probably half or more of those would be false positives, surely. There are already people in this post saying they got warnings over the most stupid mundane shit, usually when dealing with their children for legitimate reasons
1
Zanian19 9 hr ago +7
"From the tens of millions of alerts, just under 700,000 people followed links to further support" Honestly way **way** more people than I would've thought. I would've guessed closer to 700 than 700k. So these warnings are definitely doing a lot of good.
7
Zubon102 9 hr ago +1
Tens of millions of people got the warning. Most of them for completely innocent false triggers. Among them, 700,000 people thought "what the hell is this?" and clicked the link to see why on earth they were getting such a weird warning.
1
heather3113 9 hr ago +5
I worked with a sex offender in prison that once told me "how can I be a sex offender if I've never had sex". He was so twisted and even his family enabled him.
5
Sorry-Climate-7982 8 hr ago +1
How many were from trying to view the Epstein files?
1
Clbull 9 hr ago +1
I'm surprised this hasn't already been implemented widely across the web. I recall a p*** site (I think it was X*******) that started blocking searches for CSAM topics a decade ago, warning users that what they were searching for is illegal and directing them to get help. But also, I don't think mere warnings are effective. Just look at how many smokers were deterred by all the graphic warnings posted on cigarette and rolling tobacco packets.
1
v3ritas1989 9 hr ago +1
I actually stopped smoking because of that
1
Godhelptupelo 8 hr ago +1
but no consequences when they vote one of their fellow perverts into the white house...hmmph
1
Unluckypandastoo 8 hr ago +1
I triggered it by looking up the Kung fury panda 2 meme of, "Child po rn looks so cute." Or any combination of, "Child po."
1
Indie89 9 hr ago +2
Could this also be AI scraping the net for everything? Probably unlikely, but very worrying if so.
2
Known-Flatworm-2827 8 hr ago +1
I remember when I was first watching anime and someone mentioned lolis... naive me had no clue and googled that, got a warning as well lmao
1
BigCrit20 8 hr ago +1
As a lifetime wargamer, I was very upset to learn what else CP stands for.
1
The_Lou_Dynassti 8 hr ago +1
Divorce lawyers crying tears of joy.
1
Talino 8 hr ago +1
Yeah, once got asked by a user why a supplier's website was blocked: [https://pulledmeats.co.uk/](https://pulledmeats.co.uk/) I don't believe these numbers mean what they think they mean
1
chovies93 8 hr ago +1
You can literally pull up a random hashtag on Twitter and get CP. It's fucked how easy it can be
1
rensorship 8 hr ago +1
Anyone check Trump's BlackBerry?
1
ThirdAltAccounts 9 hr ago -1
There’s more of this shit in the clear net than the dark net unfortunately
-1
MikeR_Incredible 9 hr ago +1
Good. Also, wtf? Crazy that it happens so often
1
arul20 8 hr ago +1
I for one am very happy that groups are working with tech platforms to address this problem in a science-backed way, together with potential help for p*** addicts.  Very positive!!
1
furrysalesman69 8 hr ago +1
Were they looking up a certain president’s background?
1
cyberdude419 9 hr ago +1
Holy F****** Shit! That’s a lot, way too many demented people on this planet
1
jadelink88 8 hr ago +1
I got one for trying to generate an AI picture of a troll from the LotR movies. No, not naked or anything. The art generator had a full-on freak out. It actually produced an image of a naked underage girl, totally unprompted, when I was trying to make a satirical picture (Trump smoking a bong with the Devil, asking him how Epstein was). It obviously had data on who Epstein was and why this might appear in that context. Creepy.
1
Yesterday_Infinite 9 hr ago -14
Why warnings and not police?
-14
Walmartian_Beta 9 hr ago +1
So many of those warnings get triggered by weird searches - using automated word filters gets fucky sometimes - I had to help program one where I used to work and we ended up automatically banning kids for saying "grape" and similar innocent words because they contained a partial match for a forbidden word. Based on the number of warnings they had to send out, the police would stop responding pretty fast because there would be so many calls.
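(For the curious, here's a minimal sketch of that failure mode, the classic "Scunthorpe problem". The blocked list is a made-up example, and this is just an illustration of naive substring matching vs whole-word matching, not anyone's actual filter:)

```python
import re

BLOCKED = ["rape", "cp"]  # hypothetical example terms; real lists are much longer

def naive_filter(text: str) -> bool:
    """Flag if any blocked term appears anywhere, even inside another word."""
    t = text.lower()
    return any(term in t for term in BLOCKED)

def word_boundary_filter(text: str) -> bool:
    """Only flag blocked terms that appear as whole words."""
    t = text.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", t) for term in BLOCKED)

print(naive_filter("grape juice"))               # True: "rape" inside "grape"
print(word_boundary_filter("grape juice"))       # False: no whole-word match
print(word_boundary_filter("yamaha cp reface"))  # True: still a false positive
```

Word boundaries fix the "grape" class of misfires, but as the "cp reface" case shows, short multi-use abbreviations still trip the filter no matter how you match them.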
1
-KansasCityShuffle 9 hr ago +1
I was trying to find a news article the other day and searched “82 year old pedophile not charged thousands images” or something searchy like that and it came up with the warning. So the warning is just ridiculous and means nothing. 
1
Walmartian_Beta 9 hr ago +1
I get the purpose behind the warnings - they contain links to get people help, however, most of those predators do not want help, they don't want to change, they're evil.
1
-KansasCityShuffle 8 hr ago +1
Actually, you are right and I agree. The warnings are good and if they help people who want to be helped then it’s a good thing. I’ve always advocated for pedophiles being able to access help *before* they start preying on children.  I also think there are two different types of pedophiles.  I’m a straight woman and if no man for the rest of my life wanted to have sex with me, I just wouldn’t have sex again. I wouldn’t r*** a man to make myself feel better.  I think pedos are the same. Some are disgusting rapists who will actively seek out children and love watching them suffer, and some are like me. Just a person attracted to something they can’t control and will not act on that.  I think I’m in the minority in that thinking, though. 
1
Yesterday_Infinite 9 hr ago +1
I get that, but as a father, I would totally be ok if an investigator knocked on my door a couple days after searching "Children's grape drink" to ask me questions. For every 100 innocent people they interview, if they got one predator, we'd all sleep better at night. Obviously devoting such resources to that is unfortunately a pipe dream.
1
stumac85 8 hr ago +1
They wouldn't knock in a friendly way; they'd go in and seize, and eventually destroy, all your digital devices. Plus the indignity of all your neighbours watching you go into the back of a police car as detectives walk out of the building with computers etc. That's fine for someone who is guilty, but life-destroying for a false positive.
1
catstone21 8 hr ago +1
As a father, I don't want cops making those kind of calls. I wouldn't trust most of them to investigate before they escalate. 
1
Cryptizard 9 hr ago +1
Because their filters are very broad. You can get a false positive warning for searching some benign terms. I think it’s also not illegal to search for something if you don’t get any results back. You aren’t actually possessing anything illegal.
1
Romeo9594 8 hr ago +1
Because the vast majority are going to be false positives due to the use of generic or multiuse keywords
1
The_Bitter_Bear 9 hr ago +1
If the filters are aggressive, there are likely a lot of situations where it wasn't someone actually looking for illegal material, be it completely innocent or not. It doesn't help at all that for years the p*** industry has normalized language about the age of younger performers that absolutely should set off those filters, even if the content itself isn't illegal. I have to imagine it generates a lot of false leads, or in many areas just a search likely isn't enough; sadly law enforcement is busy enough dealing with people possessing and distributing. But I guess it would really require knowing what sets it off. It very well could be a good thing for the warnings to err on the side of caution and warn users about problematic searches vs reporting just the most blatant/vile. Might get some people to reconsider their behavior before they become an issue.
1
LikelyAlien 8 hr ago +1
F*** a warning! Send them to the Gulag!
1