It seems crazy to me, but I’ve seen this concept floated in several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce the number of real-life child victims.

Like the comments on this post here.

https://sh.itjust.works/post/6220815

I find this argument crazy. I don’t even know where to begin with how many ways this could go wrong.

My view (which is apparently not based in fact) is that AI CSAM is not really that different from “actual” CSAM. It still causes harm when viewed, and it is still based on the further victimization of the children involved.

Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn’t involve real, living victims completely ignores the reality of how AI content is created.

Some have compared pedophilia and child sexual assault to a drug addiction, which is dubious at best and pretty offensive, IMO.

Using drugs has no inherent victim, and it is not predatory.

I could go on, but I’m not an expert or a social worker of any kind.

Can anyone link me articles talking about this?

  • ZILtoid1991@kbin.social · 1 year ago

    To those who say “no actual children are involved”:

    What the fuck was the dataset trained on, then? Even regular art generators have had the issue of “lolita porn” (not the drawn kind, but the “very softcore” kind with real kids!) ending up in their training material, and with current technology it’s very difficult to remove it without redoing the whole dataset yet again.

    At least with drawings I can understand the point, as long as no one uses a model and it’s easy to differentiate between real images and drawings (I’ve heard really bad things about those doing it in a “high art” style). Have I also told you how much of a disaster it would be if the line between real and fake CSAM were muddied? We already have moronic people arguing “what if someone matures faster than the others,” like Yandev. We’ll have “what if someone gets jailed after thinking their stuff was just AI-generated.”

    • Chozo@kbin.social · 1 year ago

      Even regular art generators have had the issue of “lolita porn” ending up in their training material

      Source? I’ve never heard of this happening. I feel like it would be pretty difficult for material that’s not easily found on the clearnet (where AI scrapers source their training material) to end up in the training dataset unless it was included very intentionally.

  • pixxelkick@lemmy.world · 1 year ago

    Boy, this sure seems like something that wouldn’t be that hard to just… do a study on. Publish a paper, perhaps? Get it peer reviewed?

    It’s always weird to me when people have super strong opinions on topics that could just be resolved by studying them and doing science.

    “In my opinion, I think the square root of 7 oughta be 3.”

    Well, I mean, that’s nice, but you do know there’s a way we can find out what the square root of seven actually is, right? We can just go look at the actual answer and make an informed decision based on that. Then you don’t need to have an “opinion” on the matter, because it’s been put to rest, and we can start talking about something more concrete and meaningful… like interpreting the results of our science and figuring out what they mean.

    I’d much rather discuss the outcomes of a study on, say, AI-generated CSAM’s impact on proclivity in child predators: hashing out whether it really indicates an increase or a decrease, identifying flaws in the study, and deciding what to do with the info.

    As opposed to just gesturing and hand-waving about whether it would or wouldn’t have an impact. It’s pointless to argue about what color the sky oughta be when we can just, you know, open the window and see what color the sky actually is…

    • Nonameuser678@aussie.zone · 1 year ago

      I love your enthusiasm for research, but if only it were that easy. I’m a PhD researcher and my field is sexual violence. It’s really not that easy to just go out and interview child sex offenders about their experiences of perpetration.

  • Hanabie@sh.itjust.works · 1 year ago

    The way I see it (and I’m pretty sure this will get downvoted) is that pedophiles will always find new material on the net. Just like with actual, normal porn, people will put it out.

    With AI-generated content, at least there’s no actual child being abused, and it can feed the need for ever-new material without causing harm to any real person.

    I find the thought of kiddie porn abhorrent, and I think for every offender who actually assaults kids, there are probably a hundred who get off on porn, but this porn has to come from somewhere, and I’d rather it come from an AI.

    What’s the alternative, hunt down and eradicate every last closeted pedo on the planet? Unrealistic at best.

  • Meowoem@sh.itjust.works · 1 year ago

    Your statement is “I don’t know what I’m talking about, but I have strong opinions.” That’s understandable, but if we really care about harm reduction, then it has to be an evidence-based, science-backed policy.

    I have no idea what the right thing to do is but I want whatever helps mitigate risk and harm.

  • Fal@yiffit.net · 1 year ago

    AI CSAM is not really that different from “actual” CSAM

    How do you not see how fucking offensive this is? A drawing is not really different from a REAL-LIFE KID being abused?

    It still causes harm when viewed

    The same way killing someone in a video game will cause harm?

    it is still based on the further victimization of the children involved

    The made-up children? What the hell are you talking about?

    Some have compared pedophilia and child sexual assault to a drug addiction

    No one sane is saying that actually abusing kids is like a drug addiction, but you’re conflating pedophilia and assault. When pedophilia is compared to a drug addiction, it’s non-offending pedophiles who are being discussed. Literally no one thinks assaulting kids is like a drug addiction; that’s your own misunderstanding.

    Can anyone link me articles talking about this?

    About what exactly? There’s 0 evidence that drawings or fantasies cause people to assault children.

      • xigoi@lemmy.sdf.org · 1 year ago

        Generated by a computer by looking at some normal porn as well as (non-sexual) pictures of children and trying to combine those?

    • JackGreenEarth@lemm.ee · 1 year ago

      I don’t get it. It seems many people want to condemn all forms of child porn, seemingly to avoid downvotes, because for some reason the internet community can’t see that AI-generated images don’t harm anyone.

  • pinkdrunkenelephants@lemmy.cafe · 11 months ago

    People like that are pedo apologists, and the fact that they’re not being banned from the major Lemmy instances tells us all we need to know about those worthless shitholes.

  • Killing_Spark@feddit.de · edited · 1 year ago

    I’m just gonna put this out here and hope not to end up on a list:

    Let’s do a thought experiment and be empathetic toward the human behind the predator. Ultimately they are sick and feel needs that cannot be met without doing something abhorrent. That is a pretty fucked-up situation to be in. Which is no excuse to become a predator! But understanding why people act the way they do is important to creating solutions.

    Most theories about humans agree that sexual needs are pretty important for self-realization. For the pedophile, this presents two choices: become a monster or never achieve self-realization. We have to accept that this dilemma is the root of the problem.

    Before, there was only one middle-ground option: video and image material that the consumer could rationalize as being not as bad. Note that this isn’t my opinion; I agree with the popular view that this still harms children and needs to be illegal.

    Now, for the first time, there is a chance to cut through this dilemma by introducing a third option: generated content. This still uses existing CSAM as a basis, but so does every database used to find CSAM for prevention and policing. The actual pictures and videos aren’t stored in the AI model and don’t need to be kept after the model has been created. With that model, more or less infinite new content can be created, which IMO harms the children involved far less directly. This is, IMO, different from actual CSAM because no one can tell who is and isn’t in the base data.

    Another benefit of this approach has to do with why CSAM exists in the first place. AFAIK, most of this material comes from situations where the child is already being abused. At some point the abuser recognizes that CSAM can get them monetary benefits and/or access to CSAM of other children. This is where I’ll draw a comparison to addiction, because it’s kind of similar: people doing illegal things because they have needs they can’t fulfill otherwise. If there were a place to get the “clean” stuff, far fewer people would go to the shady corner dealer.

    In the end, I think there is a utilitarian argument to be made here. Even accounting for the far-removed damage that generating CSAM via AI still deals to the actual victims, we could help people not become predators, help predators not reoffend, and, most importantly, prevent or at least lessen the amount of further real CSAM being created.

    • Surdon@lemm.ee · 1 year ago

      Except there is a good bit of evidence showing that consuming porn actively changes how we behave in relation to sex. By creating CSAM with AI, you create a depiction of a child that is a mere object for sexual gratification. That fosters a lack of empathy and an egocentric, self-gratifying viewpoint. I think that can be said of all porn, honestly. The more I learn about what porn does to our brains, the more problematic I find it.