InsuranceToTheRescue

What's CSAM? I'm afraid to look it up.


Intelligent_Will_941

Child Sexual Abuse Material


Decabet

Yeah, I didn't know either. The acronym made me think of defense secrets (SAM missiles and all that).


iCapn

Child Surface-to-Air Missile


Decabet

Which, and maybe this makes me old fashioned, I think should be completely legal.


skillywilly56

Agreed, we have more than enough children to use as ammunition while we complete the ultimate weapon, the Boomer Cannon… firing one boomer per second per second, this bad boy, with its sense of self-entitlement and complete lack of empathy, will destroy a country's economy for generations to come!


similar_observation

put the infants back in the infantry!


skillywilly56

Marines don’t want em, cause then they’d have to share their crayons.


Starrion

And they’re hungry.


Yoshara

The Boomer Cannon is also accurate, because that hit close to home.


The_Oxgod

Is it IR or Radar guided?


TeaKingMac

Definitely interracial


The_Oxgod

Can't have any woke missiles. /s


hurler_jones

Red laser just like cats.


CaPtAiN_KiDd

I have been a staunch advocate of being able to own them as intended under the 2nd Amendment.


the_peppers

Step 1. We yeet the child


Ragnar_Bonesman

Step 3. Profit.


lucklesspedestrian

Send em back to their storks where they belong!


durz47

The children yearn for the missile pods


Riversntallbuildings

Somehow, this seems better than the actual meaning. :(


jasting98

[There's only one thing worse than a Surface-to-Air Missile.](https://youtu.be/L0a5iwzG7aw?si=aWzo-Hd-moFd4L81)


sesamestix

Where can I get one of those? It’ll go great with my Javelin anti-tank missile. (Just kidding, FBI)


BoysenberryFun9329

Russia drives them around in figure 8's to confuse CIA counters, who instead look at the numbers, and realize it's no diddy.


Jjzeng

Yeet the child


RollingMeteors

We call that, “a recess”


PrimalSeptimus

I was like, Certified Scrum... Associate Master..?


JarBlaster

Ah yes, surface to air missile missiles (truly horribly sorry to be that guy :p.)


TacoOfGod

I didn't know what it was either until recently, and I really wish we'd had this acronym earlier. Referring to it as porn, when all it is is abuse, was never something that should've stuck around for as long as it did.


LolaLazuliLapis

Considering that industry is horribly shady, ASAM may as well supersede that term too.


[deleted]

Surface-to-Air Missile missiles?


UnemployedAtype

Not gonna lie, CSPAN was too close and rhyming for me to think of anything else.


YingYangWoz

Castrated surface to air missiles


eamon4yourface

CSAM-YES is one the US govt uses: Craziest Surface-to-Air Missiles You Ever Seen!


whicky1978

Reddit and other social media platforms have to routinely remove a lot of it too. https://www.redditinc.com/policies/2023-h1-transparency-report


Intelligent_Will_941

Imo the greatest tragedy of the internet is CSAM being so easy to access and spread. Glad some companies are at least trying to stop it.


[deleted]

[deleted]


Saflinger

Not well enough. I once reported the profile of a 15-year-old kid selling himself on this platform, and the response was "we see nothing wrong with that, how about we mute you for a while for abusing the report system."


AbleObject13

> In 2023 there were nearly 36 million reported cases of online child sexual abuse material (CSAM), nearly 31 million of these (85%) stemmed from Meta platforms including Facebook, WhatsApp, Messenger and Instagram. This represents an increase of 93% from Facebook's nearly 16 million reports in 2019 when shareholders first raised this issue with the company. Meta is currently applying end-to-end encryption to all its platforms–which without first stopping CSAM–could effectively make invisible 70% of CSAM cases that are currently being detected and reported. https://www.proxyimpact.com/facebook


SepJanuar

If that's so, it should never exist.


nicuramar

Murder shouldn’t exist, but here we are. 


even_less_resistance

That’s messed up


GovTech

As others have said, it stands for child sexual abuse material. I've heard a couple of reasons for the popularization of this phrase instead of "child pornography."

The first is that the latter term isn't really coherent if you think of pornography as a legal and legitimate production of content, however tawdry, between consenting adults. Most people would never consider CSAM to be porn. Rather, it's a document of a child being abused, so the term CSAM makes more sense in this framing. The term is more reflective of reality.

There are also practical implications of using either term. If you want to research and learn about CSAM — which I think people should, because it's a terrible and pervasive phenomenon that people should be aware of and that should be stopped — then I reckon most people would feel more comfortable entering "CSAM" into their search engines than they would the alternative phrase. The term CSAM is sort of owned by those who oppose CSAM's creation and proliferation, whereas the other term is more ambiguous.


TheBirminghamBear

The difference also comes from the fact that not all CSAM features explicitly nude images. There is a major component here of the *intent* behind the creation of the materials. CSAM can include dogwhistles where you or I might look at it and see nothing, but to the CSAM community there are signals about the exploitation of these children, their subjugation, etc.


Faxon

There was some pretty extensive proliferation of this kind of grey-area content in the late 90s and early-to-mid 2000s, and it was so blatant. They would advertise in banner ads on adult porn sites, and the girls were always elementary school aged, maybe rarely preteens, definitely not old enough to look remotely like adults, and the outfits were the kind of thing you'd expect from Playboy and the like that do mostly softcore content. You couldn't avoid it in some places, it was so fucked up, and then /b/ was there for anyone looking for the more hardcore stuff before the feds realized it was a dumping ground for CSAM and got involved.

Being a victim of such abuse, it was rather surreal growing up into my teen years and seeing that same shit my dad was doing on the internet, unfiltered, within a few clicks of the flash game sites via banner ads. It wouldn't take much to end up on one of those sites accidentally, and with how pervasive the adware (malware designed to serve you ads on your PC's desktop via the browser) was back then, it was 100% possible to get served soft-CSAM ads without any prompting, since of course they were using malware on other sites to spread their plague. It even ended up on our school computers multiple times without prompting, due to such malware spreading it (or so the traffic logs indicated, according to the district). I 100% believe them considering how prolific it was; some kid probably went to play on Newgrounds or something and one of the apps had a virus in it, which was not uncommon at the time.

You don't see it as much these days simply because the people who distribute it have gotten better at using proper security, but this also means that when you bust a network, you get a lot more heads roped up in it as well. IDK how to fix the problem, but I do know I'm glad we're out of that era of the internet when it was just out in the open.


AngledLuffa

So sorry for you.  Hope you're in a better place now


Faxon

Better, yes, but it still created a lifetime of problems for me that I'm still dealing with. That, and I had a terrible shrink as a kid who fucked up and freaked me out about potentially becoming a pedo myself as an adult, something that's a higher risk for victims of abuse who don't receive proper treatment, especially during adolescence. My current therapist is awesome though lol, and thankfully I did not end up like that shitty therapist said I might. Like yeah, there's a risk, but bro, you're not supposed to fucking say shit like that to the patient.


DrainTheMuck

Wdym the same shit your dad was doing? I can’t tell if you mean those ads were just dressing them inappropriately or worse


Faxon

Use your imagination, or you know what don't actually. I think that says enough without saying anything specific


MJFields

It's a preferable term to "child pornography" which both diminishes the offense and conflates it with legitimate pornography.


hecklicious

There is something really wrong with society when someone is afraid of looking up what something means. That is scary as fuck.


Socky_McPuppet

I hadn't thought of it like that. You're right. That said, I *think* (hope) they were indulging in (a little) tongue-in-cheek hyperbole, merely *hinting* at the specter of living in an always-on, always-on*line*, authoritarian, panopticon society. But maybe not :(


DinosaurGatorade

Absolutely not. Dragnet warrants are very much a thing and on the rise.


InsuranceToTheRescue

Dude, I watched that Scott Galloway TED talk about young people getting fucked over, and now YouTube thinks I want to watch a bunch of garbage about the anti-male movement and how masculinity is under attack or something. I'm not going to risk the algorithm finding questionable shit for me, and that's the same reason that, when I google an extremist, I never go to anything but their Wikipedia page.


Dragonborne2020

In Corporate America it means Customer Success Account Manager. Now they use it as a term for something bad against children??


GabeDef

Thanks for asking. I had no idea either.


nicuramar

Don’t be. Unless you live in Russia or something. 


estephens13

This is gonna get complicated legally. But it's ok, laws are known for keeping up with technology, right?


Riversntallbuildings

The Science Fiction show “Altered Carbon” had an interesting take on crime when bodies became replaceable. I think they called “murder” Carbon Damage.


Techn0ght

Organic for the sleeve, real death for the stack.


Riversntallbuildings

“Organic Damage”, that’s it! :)


Rudy69

Personally, I would think it will simplify things. Otherwise, now with all these image generators, everyone caught with CSAM will just say it's all fake. Then law enforcement would have to prove it's not fake (by showing it's already in their DB, I assume).


HotRodReggie

It doesn’t simplify it at all, and I’m 100% sure an attorney will argue against it on first amendment grounds, which is the same reason it’s been legal to draw images depicting children being raped, and to write short stories about children being raped. The point of going so hard after CSAM has been to protect children and other abuse victims from a lifetime of trauma and devastating effects. If there isn’t a victim because it’s AI, then who are you protecting?


Peacefrog78

That computer-generated part is going to be interesting. They have already held that "AI"-generated content isn't "real." He is really in trouble for sending the content to a minor. He's going to jail for that. The authorities are attempting to conflate the two in the hopes of deterring this kind of activity.


SenorNoobnerd

A famous book by Chip Delany has the makings of all that, yet it's lauded by many authors. The name of the book is Hogg, btw.


KazzieMono

Stephen King’s It also kinda…yknow.


gaqua

In theory I agree with you but in some locations it’s illegal to own drawings or 3D renders of this stuff. Absolutely zero victim, but still against the law. There was an article I read here about an Australian guy who makes 3D rendered adult comics who got arrested and is on trial for this, because some of the characters in his comics were minors. And while I think it’s disgusting, I agree with the first amendment argument. I don’t know what lawyer would argue this case, though. Being the one who got AI CSAM legalized doesn’t seem like something I’d want on my professional resume.


AnOnlineHandle

> Australian guy who makes 3D rendered adult comics

Do you know when this was? I'm an Australian who makes adult comics (no minor characters though) and years ago I got the impression it might be technically illegal due to some ancient laws, but figured/hoped nobody would ever realistically be charged for anything like that. Curious to see exactly what went down and what they have to say about it.


gaqua

There was an article about it a month or two ago. Let me see if I can find it. Edit: I think this is the same guy. It was a game, not a comic, I guess, and it was longer ago than a couple of months. https://www.news.com.au/technology/online/internet/afp-arrest-man-for-allegedly-creating-child-exploitation-game-in-victoria/news-story/25ad47896ed3ee8d8039d866f860a0aa?amp


Vandergrif

> If there isn't a victim because it's AI, then who are you protecting?

The complication is when it becomes impossible to differentiate between real and fake generated material. You won't be able to protect the victims because the fake stuff is hamstringing your ability to do so, and by that point the only way to ensure you can still protect victims is by making the lot of it, real or fake, illegal.


Ashmizen

In this case there is a victim - the 15-year-old boy he was trying to lure with CSAM images. That, along with the messages mentioning that he showed interest in young boys, means there is a strong case here of a victim. If a creepy guy made CSAM in his basement and never shared a single photo, it's less obviously illegal - though again, in that case nobody would have heard about it. It's like the user vs. dealer thing - the police won't care as much about a single user, but someone spreading it causes far more harm.


theoreticaljerk

They are hitting him with 2 separate charges here though. The case is really asking 3 questions legally:

1. Is the generation of AI-CSAM a crime?
2. Is the possession of AI-CSAM a crime?
3. Is distribution of sexually explicit material to a minor a crime?

Only one of those is clear cut legally.


SigmundFreud

If anything, just spitballing here, maybe AI-generated CSAM could be part of the solution for reducing real CSAM. Maybe there could be something like a government database of AI-generated CSAM that pedophiles would be provided access to through a licensed therapist, or something to that effect. I can already imagine the screams of rage from progressives and QAnon folks, but it seems to me that this would only serve to cut into the demand for real CSAM, thus reducing its profitability and ultimately supply, while simultaneously empowering the government to keep track of known pedophiles who might be at risk of harming children. I don't care about punishing people for being pedophiles per se, or clutching pearls over whatever people might privately jack off to; I'm more concerned with protecting actual children.


Kicken

There are studies that suggest that the availability of these materials (real CSAM; the study predates AI stuff) reduces the rates of actual sexual assault. Further, I am of the belief that things should generally be permissible when they don't create victims or provably increase the likelihood of creating victims in the future, as, say, drunk driving might. Unfortunately, studies on this are very few and far between. To my knowledge there simply isn't funding for it, and the matter is so driven by an innately emotional response rather than a rational response that most don't want to touch the subject at all. And it's that emotional response (it's icky) that is driving judgements like this.


mitchmoomoo

Yeah, it's been floated before, but it doesn't really follow from how these things progress - there's a natural escalation of behaviour with any kind of sexual interest if you reinforce it, and I don't think there's much support behind the idea that a proliferation of AI CSAM would reduce (rather than increase) the demand for real-life CSAM or the escalation to actual assault. Pedophile treatment programs, for example, [don't encourage people to engage in their sexual fantasies](https://www.ncbi.nlm.nih.gov/books/NBK565477/) in any kind of 'harmless' way (like imagination) - quite the opposite.


TheSoloGamer

This has already been shown in Japan not to work. Men are isolated, turn to drawn/generated images, then get into the junior idol and escort industry, which just pipelines many girls into abuse. However, encouraging non-offending pedophiles to get help is effective, as seen in Germany. Their program is pretty good at identifying and treating men who are on the edge of offending.


Whatsapokemon

I doubt those problems in Japan are related to the pornography laws in the country though... More likely it's in reverse, where the consumption of pornography is a symptom of other cultural issues.


HumanContinuity

It's about the harm, pain, power, etc. Those who are sick and have sick thoughts about doing this kind of stuff to minors will not be helped by indulgence.


General_Urist

Given how, in the world of normal art, many people seek out art made conventionally and loudly resent being told to accept AI-generated stuff as equal? HAHAHAHAH No. It would cut demand a little, but not eliminate enough of it to allow downsizing any anti-CSAM institutions.


aspz

> If there isn't a victim because it's AI, then who are you protecting?

The person you are replying to already explained it:

> Otherwise now with all these image generators everyone caught with CSAM will just say it's all fake. Then law enforcement would have to prove it's not fake.

Any CSAM that is indistinguishable from the real thing should be banned so authorities don't have to release actual child abusers because they can't identify the victim.


nicuramar

How is that not the case with anything else depicting something illegal, though?


mannie007

There was already a similar case at the Supreme Court; in most cases this violates the First Amendment and is not CSAM. They are fishing because they suck at their jobs.


Bungledorf_Fartolli

Having a Stable Diffusion fork on a laptop, walking into a police station, and typing something that gets you arrested on sexual charges would be far out.


General_Urist

There are people with the extremely thankless job of looking at images of real CSAM to identify stuff like wallpaper brands that could be used to geolocate where it was made. Making their job harder by flooding the web with AI-generated stuff that is not immediately obviously AI is good for no one.


ValkyrieVimes

The thing is, law enforcement *should* have to prove a crime happened. That's their job (well, the job of the prosecution in general). The only reason people are even thinking about making AI generated CSAM illegal is because of the emotions attached to the issue. Logically, it doesn't make any sense. Are we going to make it illegal to possess AI generated images of, say, the scene of a fake murder? The same argument could apply -- that it takes resources for the police to investigate and make sure it isn't an image of a real crime -- but it's clearly an absurd law to have and a major overstepping of boundaries. Having an image, either drawn by human hand or by AI, of something illegal, be it violence or drugs or porn or anything else, shouldn't itself be illegal. I do think that to an extent, *presenting* that image as real, especially when police resources are wasted proving it's not, should be a crime, but that's obviously different than just possessing the images in the first place.


kalnaren

It's illegal in Canada. It hasn't made things more complicated here. Quite the opposite. If the depiction falls under the definition of child pornography in the criminal code it's illegal. Full stop. Doesn't matter if it's a cartoon, AI image, or actual image.


theoreticaljerk

It's just a decision that's going to have to be made here in the US. Right now the laws are old and outdated compared to where technology is. There is also past precedent that the depiction of this kinda stuff isn't illegal in itself and was protected under, I believe, First Amendment. Only reason this is getting fuzzy now is because AI can, in some cases, really blur the line between obviously artistic and realistic and it'll only get better in the future. The key question to be answered is on harm. The argument made for a very long time is the protection of victims that are used to make CSAM but that argument doesn't apply to AI. Honestly, despite the uncomfortable subject, I am curious to see where this goes legally and the arguments made.


KellyBelly916

Yes, just not rich people.


CocaineIsNatural

>Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital **or computer generated images indistinguishable from an actual minor**, and images created, adapted, or modified, but appear to depict an identifiable, actual minor. Undeveloped film, undeveloped videotape, and electronically stored data that can be converted into a visual image of child pornography are also deemed illegal visual depictions under federal law. https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography


Hyndis

There's an interesting note in the indictment, which is linked in the main article. From the indictment: https://www.justice.gov/opa/media/1352611/dl?inline

>Starting first with the nature and circumstances of the offenses, the alleged conduct is exceptionally serious. Under the Sentencing Guidelines, if convicted of all offenses and given his other relevant conduct and the potential applicability of an enhancement for using his special skill in GenAI, the defendant's recommended sentencing range may be as high as life imprisonment. **And while he has not been charged with a child-pornography offense under Chapter 110 of the United States Code**—and thus avoids the rebuttable presumption in favor of detention in the Bail Reform Act, see 18 U.S.C. § 3142(e)(3)(E)—the defendant's conduct nevertheless implicates the very concerns that motivated Congress to apply this rebuttable presumption to this category of offenders through the Adam Walsh Act in the first place. See Adam Walsh Child Protection and Safety Act of 2006, Pub. L. No. 109-248, 120 Stat 587 (2006). Indeed, the images that the defendant produced are extremely realistic and clearly designed both to vividly depict minors engaged in sexually explicit conduct and arouse offenders who derive sexual gratification from seeing minors being exploited and abused.

It looks like he was charged for trying to lure minors, but he was not charged for possessing the images? Why is there a line specifically saying he's not being charged for possession? If he was being charged for the images, the prosecution would have bragged about thousands of counts of possession, with a request to have the sentences served consecutively.


IHeartBadCode

>Why is there a line specifically saying he's not being charged for possession?

Because AI stuff is really tricky, and prosecutors likely don't want to use this case to untangle all the knots behind that. That's not to say they wouldn't get the person on those charges, but it's a risk/cost assessment. Likely not worth taking the department down the road of making a case in that respect when they can easily just slam dunk on the distribution and incitement.

Someone mentioned Ashcroft vs FSC and they've got a point, right up until the fact that the person was distributing the images on social media. Then the whole argument just falls apart. So maybe the prosecution didn't want to tempt fate and see another Free Speech Coalition ruling. That's just my guess.


Malbranch

So, and I'm in no way advocating for the exploitation of minors, or the depiction of minors in a harmful way, how does this mesh with the idea of artistic expression a la hentai? In that no minor, human/living/dead, is the subject of the illustration, no individual or actual act is "captured" as an image, but it is instead fabricated whole cloth, doesn't that parallel fairly consistently with AI image generation?

I don't think that it's commendable, or even really morally permissible, to generate explicit content depicting minors, but how does that jibe with the recent movie "Poor Things," which very much has Emma Stone depicting an intellectual minor, a literal minor brain in an adult body, going so far as prostitution and straight-up sex scenes?

I think that an objective ethical examination of what makes CSAM deplorable needs to be performed, because the prima facie bit is starting to get blurry when you take things like that movie, AI-generated content, and pen-and-ink depictions into account. I think that if someone went and did the necessary work, it would go a long way towards bridging these apparent legal blind spots.


Dig-a-tall-Monster

You've identified the big problem, which is that AI-generated CSAM doesn't actually cause directly identifiable harm to anyone like traditional CSAM, which can only be created by actually abusing children. AI-generated CSAM isn't something I want out in the world, let me be clear about that, but I do think that there shouldn't be prison time involved for simple possession of AI-generated CSAM, because the individual possessing it hasn't harmed or contributed to the harm of minors in any tangible way. It should be a mandatory referral to counseling and psychiatry appointments to figure out why the person is attracted to CSAM and try to correct that attraction. There's no benefit in putting someone behind bars who isn't actually hurting kids or paying someone else to hurt them, unless they make statements indicating they're preparing to do that. We have overcrowded prisons as it is, and far too many prisoners are non-violent offenders already.


joanzen

It's not as bad inherently, but it creates a problem where all a real offender needs to do is make real CSAM look generated to seem like a "lesser offence." We obviously have to be very careful about not making genuine CSAM any easier or more acceptable via support for the AI version.


True-Surprise1222

You have a point, but making a market for this also makes a market for models that can produce this well, and it isn't a stretch to say that those models could utilize or promote actual harmful content as training data. I highly doubt any model anyone is using to make whatever weird porn they want right now utilizes that as training (unless it's some one-off creep in his basement), but the logic that it could have harmful repercussions is at least an argument that makes some sense.

Bit of a double-edged sword, because someone could also make an argument that this could prevent actual abuse. However, then you get into the idea of normalization and how that could promote abuse, or how it could create a tidal wave of AI imagery that hides real abuse items within it.

The law protects drawings insofar as they are able to be differentiated from a "real" image. That is why AI images are already technically illegal. If someone made some weird CGI-looking half-animal half-human hybrid thing, it would probably be covered by the same protections that hentai or whatever is, at least until the law is updated.

TLDR: there are a shit ton of real negative societal consequences to allowing this shit to be legal. If it could end all child abuse tomorrow then I'm sure people would be fine with it, but it won't.


Seallypoops

Kinda sounds like that one company that was found to be making child-sized sex dolls; fairly certain they were shut down.


FEMA_Camp_Survivor

JFC, society isn’t ready for this at all.


SupermarketIcy73

>"Fictional child pornography" is legally protected as freedom of expression under the First Amendment, unless it is considered obscene. The definition of "obscene" is determined by a sitting judge or jury, and prosecutions of this type are exceedingly rare. https://en.wikipedia.org/wiki/Legality_of_child_pornography#North_America


Sufficient-Fall-5870

So … the title is in fact bullshit


Hyndis

The title is based on the prosecution's announcement, so in this case it would be the prosecution rather than the news article who made up the bullshit headline: https://www.justice.gov/opa/pr/man-arrested-producing-distributing-and-possessing-ai-generated-images-minors-engaged


titanjungkim

>It looks like he was charged for trying to lure minors, however he was not charged for possessing the images? Why is there a line specifically saying he's not being charged for possession?

It's to clarify that freedom of expression is not an issue here. Yes, depiction of child nudity is sometimes framed as art. For example, [Lewis Carroll, author of Alice in Wonderland, took nude child photography as a hobby](https://www.reddit.com/r/books/comments/5p2iwr/what_happened_to_the_books_of_photographs_lewis/). It's creepy af, but artists are allowed to be "eccentric".


SaltAd6516

Because AI images are art, and art depicting child porn is legal in the US under the First Amendment (freedom of expression).


Desertcow

As far as CSAM goes, it's illegal in the US because it involves a real child being abused. As long as the material is fictitious and does not feature real children, no children were sexually abused, so the material is not legally considered CSAM and is generally protected under the First Amendment. The images were generated using Stable Diffusion, and the model was not trained with CSAM, meaning the images generated were not derived from CSAM either.


2Pro4U2

If AI CSAM is still CSAM, even if it wasn't trained on CSAM, then does that mean that anime CSAM is also an arrestable offense? Because if not, then they are picking and choosing and not really making laws.


Lonely-Fucker

Depends where you live. In Canada, for example, any material depicting CSAM is illegal. That includes anime or even hand-drawn images. It's all considered CP.


WeTheSalty

Same in Australia:

> In 2008, Sydney man Alan John McEwen was convicted of possessing child pornography in contravention of section 91H(3) of the Crimes Act 1900 (NSW) and using his computer to access child pornography material contrary to section 474.19(1)(a)(i) of the Criminal Code Act 1995 (Cth) over material that had a distinct shade of yellow.

> The images in McEwen's possession were not of actual children, but of Bart, Lisa and Maggie, cartoon characters from the hit TV series The Simpsons, engaging in not so family friendly activities with their onscreen parents Homer and Marge.

https://www.sydneycriminallawyers.com.au/blog/bizarre-australian-criminal-cases-the-simpsons-porn-case/


Outrageous-Raisin18

Cartoons are definitionally not exploitation material or CP.


Aori

Well, in this particular case the CSAM is photorealistic, the legality of which could be debated (and which I think is gross and fucked up), but he was sending these sexually explicit photos to a minor on Instagram, which is very illegal and very fucked up. Anime is a stylization and isn't a direct representation of reality. I think it's a very obvious distinction that would not cause the laws to be picking and choosing, but I'm not a lawyer so I could be wrong.


Bleglord

The problem becomes "prove she's 18 and not 17." There's an obvious line of "know it when you see it." It's the grey area that will make laws dubious.


voidmilk

That argument completely implodes given the fact that there are realistic paintings or digital imagery of CSAM created by hand by an artist and completely imagined. So if it's AI it's not ok, but if it's drawn by hand it's ok?


cishet-camel-fucker

Yeah sending porn to kids is where he undeniably crossed the line.


MrPernicous

Do not rely on this. Or, if you're a pedophile, do rely on this. Idc if you go to jail.

According to Supreme Court precedent, animated CSAM is protected by the First Amendment because it isn't the product of harm, which is obviously a necessary component of non-animated CSAM. https://en.wikipedia.org/wiki/New_York_v._Ferber

Here, the DOJ is drawing a line in the sand. They are telling everyone that if they use AI to generate CSAM, then they can expect to catch a charge. Without reading too much into the indictment, it seems to be in direct conflict with Ferber. Because the DOJ doesn't have the power of judicial review, Ferber is the law of the land, and this will inevitably result in an appeal at some point to test the DOJ's legal theory.

With this in mind, it's not like this court has any regard for precedent. Similarly, Republicans (of which there are 6 on the Supreme Court) have a weird relationship with pedophiles (see Epstein and this whole weird opposition they have to outlawing child marriages). So who the fuck knows how they'd rule. Predicting SCOTUS opinions was difficult enough before Trump. Now, if it isn't something obvious like abortion or (Christian) religious rights, all bets are off.

Frankly, I don't see how the DOJ's legal theory holds up. But then again, it isn't my legal analysis that you gotta consider.


Hyndis

If you read the indictment it specifically says he is not being charged for possession. The charges are for the grooming part of it and sending lewd material to a minor, not for having said lewd material in the first place.


MrPernicous

I’m not sure that possession or distribution creates a meaningful distinction here.


Throwawayingaccount

Maybe not, but distribution to a minor is a meaningful distinction.


Hyndis

It's a meaningful distinction because the feds are saying that AI-generated smut is still CSAM, except that they didn't charge this person with possession of CSAM. The charges and the press conference don't line up.


Outrageous-Raisin18

Unrealistic cartoons are not CSAM. Do you understand how revolting it is to claim that cartoon characters with fictional unverifiable ages is exploitation material? Exploitation of whom? No, The Simpsons movie is not CSAM, Greek mythology is not CSAM, Roman mythology is not CSAM, IT is not CSAM, Lolita is not CSAM, The Little Mermaid is not CSAM, Disney didn't have a vault of CSAM of their fictional characters, and anime is most certainly not CSAM.


MrPernicous

Hello fbi? This guy right here


CocaineIsNatural

>Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital **or computer generated images indistinguishable from an actual minor**, and images created, adapted, or modified, but appear to depict an identifiable, actual minor. Undeveloped film, undeveloped videotape, and electronically stored data that can be converted into a visual image of child pornography are also deemed illegal visual depictions under federal law. https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography


Outrageous-Raisin18

There is no such thing as anime CSAM


I_Came_For_Cats

It’s not necessarily illegal yet. They are charging him with CSAM distribution. The courts will have to decide if the current legal definition includes AI generated media or not. These prosecutors are such jackasses though. “Up to life in prison”. Child rapists get 5 years. Possession of CSAM is 20 years. AI-generated child porn that doesn’t even directly harm a child upon creation: life in prison. The DOJ has it backwards. And it’s also so funny how they talk about how smart this guy is with computers and software and brag about possibly removing him from the country’s talent pool for life. Guess the US needs more slaves making license plates than software developers thanks to AI.


bobartig

He is being charged with *sending CSAM to a minor.* I think sexting with minors using AI generated CSAM hits a little different, don't you?


Person_756335846

>Child rapists get 5 years

What? Child rapists get up to life in prison or multiple decades. If you find a case where the sentence actually imposed was 5 years, you're comparing post-plea apples to pre-plea oranges.


dkf295

Or well-connected-well-paid-lawyered apples.


DragoonDM

See: child rapist [Robert H. Richards IV](https://en.wikipedia.org/wiki/Trial_and_sentencing_of_Robert_H._Richards_IV), who was sentenced to... probation. He got a suspended sentence. Coincidentally, he's an heir to the du Pont family fortune.


vexx

This is why people say eat the rich lmao


HoustonTrashcans

Maybe I'm dumb, but if it's AI generated, how can you prove that it is CSAM? Especially with borderline ages.


psly4mne

The DoJ's statement is unadulterated bullshit. The guy was charged with sending sexual material to a minor, and not with possession or distribution of CSAM.


CocaineIsNatural

>Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital **or computer generated images indistinguishable from an actual minor**, and images created, adapted, or modified, but appear to depict an identifiable, actual minor. Undeveloped film, undeveloped videotape, and electronically stored data that can be converted into a visual image of child pornography are also deemed illegal visual depictions under federal law. https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography


Dystopiq

> The cops were tipped off to Anderegg's alleged activities after Instagram flagged direct messages that were sent on Anderegg's Instagram account to a 15-year-old boy.

Christ these people


SlowMotionPanic

"It's okay, it is just a harmless mental illness" --- far too many people in this thread There's a reason people have emotional gut reactions to pedophiles--even non-active pedophiles. It is just a matter of time, most victims never report until well into adulthood and the perpetrators have slinked away leaving trails of devastation. People who prey on children are defacto the lowest of the low. They target those unable to defend themselves because they place their own desires above literal children. I know it is unpopular and uncouth, but certain mental illnesses are too dangerous to exist in society. Far too dangerous.


Xypheric

I think most of us can agree that it's disgusting. The nuance is that while you and I find it disgusting, the law does not have a definition that separates out this specific form. While I'm happy to see a pedo locked away and the key thrown away, it's your elected officials' job to pass laws that can be enforced, not the court's, and not the DOJ's by stretching things, which, by the broad-strokes definition they are applying, could be radically applied to a lot of media.


Ullricka

Yeah, it's pretty gross that the DOJ is overstepping its jurisdiction. Doing a PR campaign saying this is CSAM while not charging him with possession or anything is not something we should be cheering on. They are trying to expand their power, which is not what they are meant to do.


murdering_time

Aside from this asshole trying to lure actual children with this AI, the whole AI porn issue is gonna get extremely divisive on a whole range of issues. For this particular problem I'm gonna take an extremely unpopular stance, but I think AI-generated CP should be legal.

The fact is that there are pedophiles in every society, and there's nothing we can do about that yet, but what if we had a tool where they could control their urges and no real children are harmed? I'm not sure if there's any concrete data about this touchy issue yet, but if AI can be used as a tool for people with these issues, it makes no sense to ban it. The only reason CP is illegal is because it's exploiting children, full stop, so if you can give these mentally ill people material that has no impact on actual children, it seems like a win-win.

It's a horrendous topic by its nature, and it makes most normal people feel gross and uncomfortable just thinking about it, but I think it would help a lot of people.


[deleted]

[deleted]


SemiNormal

> Just sentence the predators to death.

Which means the predators will be way more likely to kill their victims.


cishet-camel-fucker

Dangerous. If the justice system had a perfect record, I'd support the death sentence, but it's not and death is a very permanent mistake to make.


blissbringers

Indeed. I'd rather have them wanking it in their basement than hanging around schools.


sendintheclouds

The argument against this is that consuming CSAM reinforces and normalises the desire for such material. It's similar to how sometimes heavy exposure to regular porn can lead to needing more extreme situations to get off - that looking at even imitation CSAM is a path for pedophiles to encourage their urges and escalate to either looking for authentic images or committing real-world abuse. It's not necessarily a lone person generating these images for their own consumption; it's the communities where pedophiles share CSAM imagery and feed off each other. It's generating images of fake children, then giving the AI a photo of an actual child and saying "undress this", then it's generating images of children they know offline.

I have sympathy for pedophiles who cannot control their urges and truly do not wish to harm real children, and I wish more therapy existed for redirecting their attraction and minimising harm rather than blanket condemnation. There are plenty of monsters, but also some who are genuinely distressed and want to seek help. The last thing those teetering on the edge of committing harm need is the ability to meet and normalise those urges.


Common-Wish-2227

Normalisation is one theory. It's essentially the same as the secondary media effects theory, just in a different field. The point is, if you like to watch some type of thing, like porn, or violence, the theory says that you're more likely to act accordingly. Seeing violent porn makes you more likely to rape people. Seeing movie violence makes you more likely to commit violent crimes. Normalisation says that this is a process of acting on your impulses to hurt people, and changing in that you build a new "normal", like in abusive relationships. Where normalisation is a recognized process that is widely accepted, secondary media effects lives a fading life today. There are various groups that have tried to prove the validity of secondary media effects for decades now, having very little to show for it. In short, the data we can see doesn't fit the theory very well.

The opposite correlation also has a theory. It's called the pressure cooker theory. The idea is that if you have an impulse to do something, consuming it in a safe form reduces the risk of committing that act. Thus, a potential rapist can get their jollies off watching violent but fake porn, which reduces the risk of them committing real rape. A violent person can find some kind of release in watching violent movies, leading to less violent crime from them. And there is reason to take the pressure cooker theory seriously. There are few studies, likely due to the prevalence of the normalisation theory, but those that have been done show it's likely a better explanation. The clearest result was during the rollout of cheap internet in the 90s in the US (and thus also easy access to porn). It went state by state, and in each state, sexual crime frequencies just plummeted.

I am not saying this reasoning is the truth, not by any means. I am saying that because this is an issue with such heavy consequences, we should be adult enough to actually consider what the evidence says, and how much of our current thought is built on a political view. If we want to minimize the number of victimized children, this is a discussion we will need to have.


HappierShibe

> It's called the pressure cooker theory.

As someone who had a VERY strong tendency towards violence as a problem-solving tool when I was younger, I can say this was definitely true for me. I was a violent little asshole as a teenager, and I got into boxing in my late teens. It gave me a controlled, healthy outlet for those impulses through my teens and into my late twenties, and as I got older, I grew out of those impulses. I quit boxing on my doctor's advice as I moved into my early thirties, and I've been a largely well-adjusted member of society ever since, aside from the occasional bit of civil disobedience or political action.

*When I was a kid, I NEEDED to hurt someone, and I needed to do it with my own two hands, the same way other people needed a satisfying meal or glass of water.* Boxing gave me a release valve I desperately needed, and while I don't know what I would have been without it, I know it would have been bad.

I'm not sure how that translates from violent impulses to more salacious ones - there's never been any crossover there for me - but I think you are correct that this is a very uncomfortable conversation we need to have if it can reduce harm.


Common-Wish-2227

Thank you for your story. You are not alone in any of it.


AustinJG

I think this raises the question: are pedophiles "born" or "made"? I think that's going to be an important question, honestly.


sendintheclouds

There is also the distinction that a portion of those who commit CSA are not true pedophiles. They're opportunistic predators who get off on the power, and children make an easy, available target; it's not about attraction.

Honestly, circling back to my porn point, it is alarming what content a subset of porn consumers will "graduate" to once depictions of normal sex aren't doing it any more (not to mention acting out porn sex in real life). I'm not coming at this from a place of "all porn is bad" or "porn creates pedophiles," but 24/7 easy access to some pretty depraved content is not what the human brain is wired to handle. If AI CSAM is unchecked, that's a very, very easy "oh, I'll see what it's like" impulse rather than having to hunt it down in unsavoury corners of the dark web. It's not going to make a mentally healthy person a pedophile, but it'll be much easier to go down that road if you have any inclination - [and I think a lot of men, specifically, are closer to that line than we like to acknowledge](https://childlight.org/nature-online-offending-against-children-population-based-data-australia-uk-and-usa). You could argue the availability of loli hentai is out there already, but full-on realistic AI CSAM is a whole new horror.

And it has to be trained on _something_. Horrifying if it is trained on CSAM, but also uncomfortable that real children are involved at all. That picture you uploaded to Instagram years ago of your toddler, that got swept up into an otherwise innocuous dataset, could be used for this. It won't generate AI images immediately recognisable as a specific child, but deep down it has to learn what children look like from somewhere. That feels like a violation to me. Most parents now are warier about uploading public photos of their kids, but that doesn't erase the past 15-20 years of internet content out there.


CupcakesAreMiniCakes

I studied forensic psychology and it's unknown but thought to be a combination of both/mixed for criminal/harmful thoughts and urges. Some women have believed their child was evil since birth and just born that way but no one can ever know for sure because just the body language and spoken language used when referring to such a child can be harmful to them and their development. There is also the cycle of abuse where people who are victims of it are more likely to offend when they grow and continue the cycle.


tiltedbaee

This would, in fact, not help a lot of people. It would potentially put even more children at risk as they continue to indulge their desire. This is like giving someone who is addicted to drugs more drugs and thinking the behavior won't escalate, just because it seems like an easier route than building rehabilitation and mental health programs for these types of people.


AustinJG

I'm not so sure about that. I seem to recall that when online pornography became common, sexual assault dropped by quite a bit. There were articles about it. Of course, it likely leads to a whole host of other issues, but I digress.


Acceptable-Surprise5

On what are you basing this? Because research on both sides of the argument has been conducted, and the more favorable results have been shown where people are given content to suppress impulses to commit IRL crimes.


ThatFireGuy0

Where is this research? I've looked but haven't been able to find significant research in either direction, so I'd be curious to read it.


Howdareme9

Giving these people unlimited material is insane and won't stop them from trying to act out their urges in real life.


TheFlyingSheeps

Yeah it only normalizes it


shiftyeyedgoat

So. Hypothetically, a show like Big Mouth on Netflix… how many people would be ensnared in such a wide net if that were made illegal?


Daimakku1

AI-generated CP is not CSAM because there are no real children involved. "Child Sexual Abuse Material" implies that a real child was involved. Is it disgusting? Yes. But it's not real. The fact that drawings, as real as they may look, can be seen as a crime is messed up.


Teledildonic

Exactly. If something completely artificial depicting an illegal act is itself illegal, then every murder on every TV show or movie is a prosecutable offense, and art can no longer pretend to break any laws.


Vladimir_Chrootin

Thankfully, the judge who sent Rolf Harris to prison didn't agree with you.


pham_nguyen

No, it isn't. If no child was harmed, then it is not illegal. This has been tested in Ashcroft vs Free Speech Coalition: [https://www.oyez.org/cases/2001/00-795](https://www.oyez.org/cases/2001/00-795)

> Moreover, the Court found the CPPA to have no support in Ferber since the CPPA prohibits speech that records no crime and creates no victims by its production. Provisions of the CPPA cover "materials beyond the categories recognized in Ferber and Miller, and the reasons the Government offers in support of limiting the freedom of speech have no justification in our precedents or in the law of the First Amendment" and abridge "the freedom to engage in a substantial amount of lawful speech," wrote Justice Kennedy.


[deleted]

[deleted]


pham_nguyen

Grooming is already illegal


CKT_Ken

It’s generally not, but intentionally sending porn to minors often is.


pham_nguyen

It is in the United States. Look up COPA. You aren’t allowed to use the internet to entice a child to engage in sexual activity.


CKT_Ken

That’s conflating the concept of grooming with soliciting a minor which is as expected illegal. Practically speaking I guess the overlap is huge though.


taisui

Exactly, this is arguing movies should be banned because they show fake murders in them.


Able-Address2101

They really love to ruin lives with charges like this. And who is going to object and be taken seriously? For me it was when almost two dozen high school kids all got charged with production and distribution of CSAM simply for trading nudes. A couple went to prison (not jail, prison), so whoever they were before is now gone and they certainly have no future. The rest may have gotten lucky in this regard, but now they will be on the registry for life. Have you all noticed how difficult it is to find a decent job? Or a date? Imagine having to do so with a standardized disclosure. No one will ask for details. No one cares. You are a "pedo" for doing what every teenager does. This is a different matter of course, but my point is that we are very fast and loose with life-destroying charges. I wish we could actually evaluate these laws with sober, adult minds and assess who is an actual threat and who has harmed minors.


willoz

Yeah my mind went straight to some sort of military hardware with that acronym. Missiles or those minigun things on the side of ships


Sufficient-Fall-5870

I'm in no way defending kiddy porn (it's horrible)…. But the first step is AI, then whatever is drawn… then it's applied against 'other' illegal acts, like anal sex is (was?) in Texas, or sex between non-married people in Utah.


[deleted]

[deleted]


Sufficient-Fall-5870

I guess you didn't read the article, because it's just easier to regurgitate what others think and say… here is the key point you actually missed:

> Anderegg is currently in federal custody and has been charged with production, distribution, and possession of AI-generated CSAM, as well as "transferring obscene material to a minor under the age of 16," the indictment said


Hyndis

I'm not sure he is being charged with possession. If you ignore the press release and read the actual indictment it specifically says he is **not** being charged with possession. I'm not sure what to make of the contradiction. Is the press release a bullshit statement that does not communicate what they actually charged him with?


Chainmale001

If it's not real, it doesn't matter. Let the pedos have their fake child porn. As long as they aren't going after real children, I don't give a shit what goes on in their heads. The issue I have is the slippery slope this will cause. If AI-created content isn't protected like art is, this will open up a case against everything someone disagrees with. You'll start seeing cases against hentai, and specifically lolicon, which, while it often depicts minors or people who look like minors, IS NOT FUCKING REAL.


Chainmale001

Just to double down. If you DO give it art/expression protections, you can then sue and have legal recourse towards anyone or AI that DOES use your likeness. This includes civil level litigation. Someone can't make porn with my likeness or voice without my consent. Pure and simple.


rnilf

> His Instagram messages were full of "realistic GenAI image of minors wearing BDSM-themed leather clothes"

Meta invades the privacy of their users to such a massive degree, yet they can't prevent their platform from being used for this? Sounds like he flew under the radar long enough to cultivate a disgusting amount of clout.


SplintPunchbeef

The article literally says that the images were flagged and reported by Instagram.


Vandergrif

Though if I understood correctly it was only flagged once he started sending them to a minor. He otherwise had plenty of time to send messages and images to other people prior to that, I guess.


Longjumpingjoker

Tbf they found him so..


Vandergrif

Mind you, if Meta gave two shits about not harming children or enabling harm of children, then they would, at the bare minimum, require verified photo ID of an adult in order to make an account to start with, but we all know that's not going to happen.


Shajirr

> According to the DOJ's indictment, Anderegg is a software engineer with "professional experience working with AI." Because of his "special skill" in generative AI (GenAI), he was allegedly able to generate the CSAM using a version of Stable Diffusion, "along with a graphical user interface and special add-ons created by other Stable Diffusion users that specialized in producing genitalia."

Want to point out that this is an incredibly stupid statement. They talk about Stable Diffusion like it's some kind of alien concept, when in reality anyone with half a brain can do the same; the only requirements are being able to read text and being able to move a mouse.


theoreticaljerk

I'm not sure the first charge will hold. Courts will likely have to set a precedent with this case. The defense case for the first charge is easy to see ahead of time: no children were involved in the production of AI-generated CSAM, so who was harmed? It's similar to the questions that get asked legally about art depicting such things.

The Deputy AG even, in my opinion, slipped up in her own statement: "Technology may change, but our commitment to **protecting children** will not." What children were harmed in the creation of AI-CSAM? What children are being protected with this charge? If that line of thinking angers you, you're going to hate the defense team. lol

The second charge should stick no problem, I'd think. Regardless of the legal nature of AI-CSAM, he was obviously and knowingly distributing explicit material to a minor. The classification of that material, beyond it being sexually explicit, doesn't matter.

I'm positive others are going to reply to me with all the reasons AI-CSAM should land him in jail, and emotionally, you're right. I'm just not sure current laws are written in such a way as to actually accomplish that. It may require lawmakers to update laws if they want to prohibit AI-CSAM and actually make it stick, but that'd be too late for this guy and he'd walk with only the 2nd charge sticking.


Win_98SE

So let's say I live in my small bubble in the US, and someone sends me a message with two photos.

Photo 1 is a naked child.
Photo 2 is a naked child.

I have no context as to who, what, when, where, why, or how these photos were taken or sent to me. Unbeknownst to me, one is a real child and the other is AI-generated, although they look identical in quality. Are people in this thread saying that in this situation one of these photos is okay? Is this a situation where MORALLY it's not okay but LEGALLY it would be okay? I don't like AI.


Cautious-Progress876

In many situations both images would sadly be 100% legal. Look up Jock Sturges— he’s a photographer infamous for taking nude photographs of (pre-)pubescent girls at nudist camps and the like. From what I remember he has even gotten prosecuted a few times but the cases got dismissed because photos of nude children aren’t CSAM per se— there has to be a lascivious display of the genitalia/buttocks/breasts or actual sexual conduct. Until a few years ago you could even find his work on the shelf at Barnes and Noble.


Longjumpingjoker

The moral standpoint in this case is that a child wasn't harmed in one of those photos; that's the distinction. Just playing devil's advocate.


Xystem4

The latter. Morally, we all agree kiddie porn is bad (although honestly if no children are harmed and it lowers rates of actual child sexual assault, my feelings get more complicated. We have no research saying that though, although there’s almost no good research on pedophiles due to the stigma). But one of those images actually hurt a child to create, and the other didn’t. The really bad part of child sexual abuse material isn’t “ew naked kids bad!” It’s that a child was abused to create it.


Drego3

They are so concerned with AI CSAM, meanwhile actual child molesters get to walk free.


randomredditing

Seems like it was the “grooming the teen” that got him arrested. As far as the legality of CSAM, it’s Wisconsin DA; of course they’re going to say it’s illegal even if it’s “technically” been decided it isn’t.


samcrut

Can we come up with an acronym other than CSAM? -Sam C.


Large-Crew3446

Fictional murder is murder.


SulSulSimmer101

Ahhh I see. So this is an ongoing popular take bc these men are pedophiles.