• danny161@discuss.tchncs.de · 1 day ago

    That’s unfortunately (though I’m not really sure) probably the fault of Germany’s approach to this. It usually doesn’t take these websites down but instead tries to find the people behind them and arrest them. The argument is that they would just restore a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deletion is very effective. The German approach also completely ignores the victims’ side: they have to deal with old men masturbating to them getting raped online. Very disturbing…

    • Schadrach@lemmy.sdf.org · 3 hours ago

      > They have to deal with old men masturbating to them getting raped online.

      The moment it was posted to wherever it was first shared, they were going to have to deal with that forever. It’s not like they can ever know for certain that every copy of it ever made has been deleted.

    • Dr. Moose@lemmy.world · 15 hours ago

      I used to work in netsec, and unfortunately governments everywhere still suck at hiring security experts.

      That being said, hiring for this is extremely hard: you need to find someone with below-market salary expectations who is willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more; I’m sure most European citizens wouldn’t mind a 0.1% tax increase for a more effective investigative force.

      • lennivelkant@discuss.tchncs.de · 1 hour ago

        Most cases of “we can’t find anyone good for this job” can be solved with better pay. Make your opening more attractive, then you’ll get more applicants and can afford to be picky.

        Getting the money is a different question, unless you’re willing to touch the sacred corporate profits…

      • Geetnerd@lemmy.world · 13 hours ago

        Discovery of this kind of thing is as old as civilization.

        Someone runs their mouth, or you catch someone with incriminating evidence on them. Then you lean on them to tell you where to go.

      • Ledericas@lemm.ee · 13 hours ago

        They probably make double or triple that in the private sector; I doubt the government can match those salaries. Facebook probably paid even more, before they started using AI to sniff out CP.

        • Dr. Moose@lemmy.world · 13 hours ago

          I’m a senior dev, and tbh I’d take a lower salary for the right cause, though having to work with this sort of material is probably the main bottleneck here. I can’t imagine how the people working on this can even fall asleep.

    • TheProtagonist@lemmy.world · 1 day ago (edited)

      I think you are mixing up two different aspects of this case and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites into jail.

      In this specific case they were observing this platform (together with prosecutors from other countries, in an orchestrated operation) to gather as much data as possible about its structure, payment flows, admins, and users before moving into action and making arrests. The site was taken down in the process.

      If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.

    • recall519@lemm.ee · 1 day ago

      This feels like one of those things where couch critics aren’t qualified. There’s a pretty strong history of three-letter agencies using this strategy successfully against other organized-crime industries.

      • Geetnerd@lemmy.world · 13 hours ago

        Like I stated earlier, someone was caught red-handed, and snitched to get a lesser sentence.

    • taladar@sh.itjust.works · 1 day ago

      Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off. But it is certainly not an easy choice to make.

      • Geetnerd@lemmy.world · 13 hours ago (edited)

        Well, some pedophiles have argued that AI-generated child porn should be allowed, so that no real humans are harmed or exploited.

        I’m conflicted on that. Naturally, I’m disgusted and repulsed. I AM NOT ADVOCATING IT.

        But if no real child is harmed…

        I don’t want to think about it, anymore.

        • quack@lemmy.zip · 3 hours ago

          I understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight-up sadistic fucks, and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.

          • 🎨 Elaine Cortez 🇨🇦 @lemm.ee · 3 hours ago

            I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but that’s pretty disingenuous, since people typically play videogames for fun and escapism, whereas someone seeking out CSAM is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.

        • ZILtoid1991@lemmy.world · 11 hours ago

          The issue is that AI is often trained on real children, sometimes (allegedly) even on real CSAM, which makes the “no real children were harmed” part not necessarily 100% true.

          Also, since AI can generate photorealistic imagery, it muddies the waters for the real thing.

        • Ledericas@lemm.ee · 13 hours ago

          That is still CP, and distributing CP still harms children; eventually they want to move on to the real thing, once porn no longer satisfies them.

        • misteloct@lemmy.world · 11 hours ago

          Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.

      • yetAnotherUser@discuss.tchncs.de · 21 hours ago (edited)

        It doesn’t though.

        The most effective way to shut these forums down is to register bot accounts that scrape links to the clearnet direct-download sites hosting the material and then report every single one.

        If everything posted to these forums were deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years in order to catch a handful of site admins.
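        A minimal sketch of what such a reporting bot could look like - everything here is hypothetical; the forum URL, the file-host domain, and the abuse-report endpoint are made up for illustration:

        ```python
        import re
        import requests

        # Hypothetical abuse-report endpoints, keyed by file-host domain.
        # Real hosts each have their own reporting form or API.
        ABUSE_ENDPOINTS = {
            "files.example.com": "https://files.example.com/abuse/report",
        }

        # Matches direct-download links to the hypothetical host above.
        LINK_PATTERN = re.compile(r"https?://files\.example\.com/\S+")

        def scrape_links(forum_page_url: str) -> set[str]:
            """Fetch one forum page and extract direct-download links from its HTML."""
            html = requests.get(forum_page_url, timeout=30).text
            return set(LINK_PATTERN.findall(html))

        def report_link(link: str) -> bool:
            """File an abuse report for a single link; True if the host accepted it."""
            host = link.split("/")[2]
            endpoint = ABUSE_ENDPOINTS.get(host)
            if endpoint is None:
                return False  # unknown host, nowhere to report
            resp = requests.post(endpoint, data={"url": link}, timeout=30)
            return resp.ok

        if __name__ == "__main__":
            for url in scrape_links("https://forum.example.org/board/1"):
                report_link(url)
        ```

        The asymmetry is the point: reporting is cheap to automate, while reuploading (especially over Tor, see below) is slow and manual.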

        Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse that will never be fast enough to meaningfully slow the spread of CSAM.

        Also, these sites don’t produce CSAM themselves. They just spread it - most of the CSAM exists already and isn’t made specifically for distribution.

        • taladar@sh.itjust.works · 18 hours ago

          Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content - the ones doing the part that is orders of magnitude worse than anything else about this.

          • yetAnotherUser@discuss.tchncs.de · 14 hours ago

            I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads, because that is orders of magnitude easier.

            Of the 1,400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t get my hopes up that many of those caught ever committed child abuse themselves.

            I mean, 1,400 identified out of 1.8 million really isn’t a whole lot to begin with.

              • yetAnotherUser@discuss.tchncs.de · 6 hours ago

                Not quite. Reuploading is at the very least an annoying process.

                Uploading anything over Tor is a gruelling process. Downloading already takes a long time, and uploading takes even longer. Most consumer internet plans aren’t symmetric either, with significantly lower upload than download speeds; as a rough illustration, on a typical 100/20 Mbit/s plan a 1 GB file takes about 80 seconds to download but around 400 seconds to upload, before Tor’s overhead slows both down further. Plus, you need to find a direct-download provider that doesn’t block Tor exit nodes and where uploading and downloading are free.

                Taking something down, by contrast, is quick. A script that scrapes these forums and automatically reports the download links (any direct-download site acts quickly on CSAM reports, by the way - no one wants to host that legal nightmare) can take down thousands of uploads per day.

                Making the experience horrible leads to a slow death for those sites. Imagine if 95% of videos on [generic legal porn site] led to a “Sorry! This content has been taken down.” message. How much traffic would the site lose? I’d argue quite a lot.