A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • echo64@lemmy.world · 8 months ago

    Yeah, it’s this shit I’m talking about. We have a whole legal and justice system to deal with this. No one needs to accept sexual abuse as a new normal. This shit is weird.

    • Thorny_Insight@lemm.ee · 8 months ago

      I’m not saying there shouldn’t be consequences for someone who spreads these pictures with the intention of harming someone’s reputation, but it’s incredibly naive to think the justice system is going to stop deepfakes when it can’t even prevent bike theft. 12-year-olds are making these with their smartphones. The technology is extremely accessible and easy to use, and that is not going to change. I’m sorry, but you’re not putting the toothpaste back in the tube. Wait a few years and you’ll be able to generate photorealistic porn videos of anyone you want.

      • echo64@lemmy.world · 8 months ago

        We can’t stop bike theft, so fuck off women, you’re free game coz this guy said so.

          • echo64@lemmy.world · 8 months ago

            You might want to look up what strawmanning means. I’m just flat-out mocking what you said.

    • Dkarma@lemmy.world · 8 months ago

      No, we don’t. What is happening here is not covered by current laws.

    • 0x0@programming.dev · 8 months ago

      Sexual abuse?

      Child pornography involves molesting a child and is a crime, as it should be.

      Fake nudes have been a thing for ages and are only an issue if the targeted party takes offense. It may be slander but it’s certainly not sexual abuse.

      No one is accepting sexual abuse so drop it down a notch, Karen.

      • sbv@sh.itjust.works · 8 months ago

        “only an issue if the targeted party takes offense.”

        Deepfakes can change how the victim is treated by other people, especially other kids.

        Upthread, someone states

        “we’re just going to need to accept as a new normal.”

        Which sounds a lot like accepting this kind of shit, regardless of what you call it.

      • eatthecake@lemmy.world · 8 months ago

        From another comment:

        To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also extreme risk of depression, anger, anxiety, etc. The analogy given is that it’s like watching a video the next day of yourself undergoing sex without consent, as if you’d been drugged.

        Try to imagine watching a realistic video of yourself being abused, and imagine your mother watching it. That will absolutely fuck some people up, and a lot of those victims are going to be children. Shit is going to get bad.

        • 0x0@programming.dev · 8 months ago

          I wouldn’t put actual non-consensual pornography and fake pornography of any kind in the same bag, but, geez, I’m not a doctor.

          Deepfakes certainly improve on the technical realism of ’90s Photoshop. Doesn’t that still qualify as slander? (Also not a lawyer.)

          • eatthecake@lemmy.world · 8 months ago

            The question is why, with an internet full of porn, men want non-consensual pornography that they know women are opposed to. It’s as if the hurtfulness, the lack of consent, and the control over the woman in the video are actually the point.

            • 0x0@programming.dev · 8 months ago

              Non-consensual pornography is called rape, and it’s a crime in most of the world.