Adequately_Insane@lemmy.world to Memes@lemmy.ml · 1 year ago

It will only go downhill from here

lemmy.world

42 points · 38 comments
  • bleistift2@feddit.de · 1 year ago (+31/-5)

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Am I not seeing an obvious downside?

    • PorkRollWobbly@lemmy.ml · 1 year ago (+38/-6)

      Pedophilia is not a sexuality, and CSAM, AI-generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for that instead.

      • idunnololz@lemmy.world · 1 year ago (+8/-1) · edited

        AFAIK you can’t “cure” pedophilia the same way you can’t cure homosexuality. The best you can do is teach people not to act on their desires.

        • Bernie Ecclestoned@sh.itjust.works · 1 year ago (+3/-2)

          Chemical castration?

      • bleistift2@feddit.de · 1 year ago (+3/-1)

        pedophiles should receive treatment for that instead

        In a world where many people cannot afford basic healthcare or – if they can afford it – where healthcare isn’t available in the required quantity, does your argument still hold?

        • shea@lemmy.blahaj.zone · 1 year ago (+3/-6)

          the treatment is daily merciless beatings

    • PotatoKat@lemmy.world · 1 year ago (+13/-1)

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes them want to act out what they see rather than removing the desire.

    • BolexForSoup@kbin.social · 1 year ago (+11/-3)

      deleted by creator

      • massive_bereavement@kbin.social · 1 year ago (+5)

        Based on this article, it seems that teens were using an app: https://www.msn.com/en-us/money/other/ai-generated-child-sexual-abuse-images-could-flood-the-internet-a-watchdog-is-calling-for-action/ar-AA1iMZj5

        Is that your reference?

        • BolexForSoup@kbin.social · 1 year ago (+5/-1) · edited

          deleted by creator

          • massive_bereavement@kbin.social · 1 year ago (+1)

            Stable Diffusion still has a steep learning curve and requires some monetary investment in hardware or cloud GPU access, meaning they probably have several hours to rethink how stupid what they’re doing is.

            A simple app you can download onto your phone to do this shit is a pretty easy and quick way of ruining two lives (probably).

            Then again, the hammer should fall on the developers and the app store that allowed it in the first place. (IMO)

            • BolexForSoup@kbin.social · 1 year ago (+2)

              deleted by creator

      • bleistift2@feddit.de · 1 year ago (+1)

        And what happens when they start making requests of real underage people?

        That’s the whole point of my argument. They don’t need to make requests for real people if they can get fake ones of equal quality. Your argument reads like “We can’t let people have meat. What if they start eating live cows?”

        • BolexForSoup@kbin.social · 1 year ago (+2) · edited

          deleted by creator

          • Norgur@kbin.social · 1 year ago (+3)

            First of all: that is exactly how you treat addicts. https://harmreductionjournal.biomedcentral.com/articles/10.1186/s12954-019-0340-4

            Secondly: no, we don’t have evidence that this might decrease the danger of pedophiles acting on their desires, since the technology is rather new.

            Of course we should not enable urges like that. Yet we have to be realistic: there will always be those who can’t be treated. Do you want those who cannot be stopped from indulging their desires to do so on images of real, abused children, or do you want them to vent on made-up images?

          • bleistift2@feddit.de · 1 year ago (+1)

            your argument is implying that if somehow we allow them to use AI generated child porn that it will somehow stop them from seeking the real stuff out or is somehow “better.” There is literally no evidence that suggests that in the slightest.

            Of course. How would you procure such evidence? Give a group of pedophiles access to AI generated content and check if they molest children significantly less than a control group?

            Pedophilia is an illness. […] You really need to […] take a macro view of what you are arguing in favor of.

            I’m not defending pedophilia. Given that access to pedophilia treatment and prevention of sexual abuse is often lacking, I was starting a discussion of whether AI-generated content might be part of the prevention of sexual abuse of minors. After all, there are similar programs for drug abusers. Take methadone substitution as an example. Or establishments that are called “Drückerstube” in German (a very lacking translation would be “injection rooms”) – clean rooms where drug addicts have access to clean utensils for consuming drugs.

            • BolexForSoup@kbin.social · 1 year ago (+3) · edited

              deleted by creator

              • bleistift2@feddit.de · 1 year ago (+1/-1)

                […Methadone] relieves cravings and removes withdrawal symptoms. Withdrawal management using methadone can be accomplished […] or simply maintained for the rest of the patient’s life.

                https://en.wikipedia.org/wiki/Methadone (emphasis mine)

                You also ignored my other example.

                • BolexForSoup@kbin.social · 1 year ago (+3)

                  deleted by creator

      • TheEntity@kbin.social · 1 year ago (+2/-1)

        It’s still fake. But if it looks like a person in real life, what difference does the distinction make?

        I’m pretty sure there is quite a difference between an actual human being abused and a victimless depiction of such an act. Not unlike watching a violent movie. Such people obviously still need help and treatment, but to me it seems vastly better than the alternative.

        • BolexForSoup@kbin.social · 1 year ago (+2)

          deleted by creator

          • TheEntity@kbin.social · 1 year ago (+1)

            It very much might be an either/or situation for many, even if it’s not in all the cases.

            • BolexForSoup@kbin.social · 1 year ago (+2) · edited

              deleted by creator

              • TheEntity@kbin.social · 1 year ago (+1)

                Back at you. We’re both speculating.

                • BolexForSoup@kbin.social · 1 year ago (+2) · edited

                  deleted by creator

                • LinkOpensChest.wav@lemmy.one · 1 year ago (+0)

                  Maras and Shapiro argue that VCSAM does not prevent the escalation of pedophilic behavior. Conversely, it can progress CSAM addiction. VCSAM can fuel the abuse of children by legitimizing and reinforcing one’s views of children. The material can also be used in the grooming of children, reducing the inhibitions of children, and normalizing and desensitizing the sexual demands

                  I removed the parenthetical citations because I’m not good at markdown, but you can find them in the linked paper.

    • klingelstreich@feddit.de · 1 year ago (+9/-4)

      It depends on whether you hold a world view where every person is valuable and needs help and understanding to become their best self or one where there are good and bad people and the baddies need to be punished and locked away so everyone else can live their life in peace.

      • BolexForSoup@kbin.social · 1 year ago (+8/-1) · edited

        deleted by creator

        • Kusimulkku@lemm.ee · 1 year ago (+1)

          deleted by creator

        • Kusimulkku@lemm.ee · 1 year ago (+1/-1)

          involving minors

          But if it’s just generated by AI there might be no involvement

          • BolexForSoup@kbin.social · 1 year ago (+2) · edited

            deleted by creator

            • Kusimulkku@lemm.ee · 1 year ago (+1)

              I’m not saying it’s better alternative, I’m saying it might not make sense to talk about it “involving minors”.

              • BolexForSoup@kbin.social · 1 year ago (+1) · edited

                deleted by creator

                • Kusimulkku@lemm.ee · 1 year ago (+1)

                  No if about it

                • Norgur@kbin.social · 1 year ago (+1)

                  That’s not picky about wording.
                  While I agree that stuff like that should not exist at all, there is a vast difference between it existing because someone abused a child, recorded it, and thus scarred the child for life, and it existing because someone made a computer make up pixels in a disgusting way.

            • Norgur@kbin.social · 1 year ago (+2/-1)

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit, so people can vent their otherwise harmful behavior without harming actual people. You just flat-out responded “it is enabling and doesn’t stop distribution”. So you just responded with “no, u wrong”. Care to tell us your reasons behind your stance?

              • BolexForSoup@kbin.social · 1 year ago (+2)

                deleted by creator

                • bleistift2@feddit.de · 1 year ago (+1)

                  “it is enabling it doesn’t stop distribution“

                  Norgur’s point is that you didn’t provide any reasoning why that should be the case.

    • AspieEgg@lemmy.blahaj.zone · 1 year ago (+3/-1)

      Don’t AI models need to be trained on the material they are trying to emulate?

      • Deceptichum@kbin.social · 1 year ago (+10)

        No, not at all.

        That’s why people like them: you can ask for a photo of “a monkey riding a pickle in space” or “a dog made of cheese” and it’ll make it, despite obviously having no reference.

        It only needs to be trained to know what things are, it can mix them freely.

        • massive_bereavement@kbin.social · 1 year ago (+3)

          Now I want to see said monkey…

          • Deceptichum@kbin.social · 1 year ago (+4) · edited

            [image links: Real, Cartoon, Animated]

            Haiku:
            Cosmic pickle ride,
            Monkey swings through starry tides,
            Space whimsy untied.

            • massive_bereavement@kbin.social · 1 year ago (+1)

              Well, butter my turnips! That’s not something I expected to see today and that cartoon version will eventually find its way into a shirt, I’m telling ya.

            • ivanafterall@kbin.social · 1 year ago (+1)

              These are great. What did you use for the animated version?

          • Ragdoll X@lemmy.world · 1 year ago (+1) · edited

            Prompt your heart out, it’s free.

            https://playgroundai.com/

            • massive_bereavement@kbin.social · 1 year ago (+2)

              “Continue with Google”, thanks I’m good B-)

  • OsrsNeedsF2P@lemmy.ml · 1 year ago (+10/-1) · edited

    On one hand, yes, but on the other, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and that’s being used to combat pedos globally

    • bleistift2@feddit.de · 1 year ago (+1)

      deleted by creator

  • neuropean@kbin.social · 1 year ago (+5/-2)

    What’s interesting is that mammals from mice to dogs don’t draw a distinction between arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either, nothing natural about that.
