• squiblet@kbin.social · 1 year ago

    The Fediverse is a bunch of independent websites potentially connected by compatible software, not one entity, so there’s not really a basis for comparison. You could ask about individual instances. But also, it’s about “failing to cooperate with a probe into anti-child abuse practices”, not hosting or failing to moderate material. Australian law says the regulator can ask sites about their policies, and they have to at least respond.

    • blazera@kbin.social · 1 year ago

      The article has their response. Given their warning to Google as well, apparently the responses also have to be good enough for the regulator.

      • squiblet@kbin.social · 1 year ago

        They said

        X’s noncompliance was more serious, the regulator said, including failure to answer questions about how long it took to respond to reports of child abuse, steps it took to detect child abuse in livestreams and its numbers of content moderation, safety and public policy staff.

        So yes, all the questions need to be at least addressed, and probably saying “we don’t do that because Elron doesn’t care about it” wouldn’t suffice either.

            • blazera@kbin.social · 1 year ago

              Because we’ve gone in a circle: I ask whether the site we are on right now is doing anything better with regards to this problematic material, since folks seem to care about Twitter’s failure to address it. You respond that it’s not about their lack of addressing the material, but their lack of a response to the regulatory inquiry. I point out that they did respond, and your response is that they actually need a good answer about how they are addressing the material. Which is the same premise as the article and what my first comment was about. It’s hypocrisy, because the standard isn’t being applied to the fediverse; no one is up in arms about our lack of automatic detection of problematic material or surveillance of private messaging. Because we care about privacy when we’re not being blinded by well-intentioned Musk hate.

              • squiblet@kbin.social · 1 year ago

                I posted from the article that they didn’t respond to several questions:

                X’s noncompliance was more serious, the regulator said, including failure to answer questions about how long it took to respond to reports of child abuse, steps it took to detect child abuse in livestreams and its numbers of content moderation, safety and public policy staff.

                I speculated that they probably also need adequate responses, but that’s not what the article or the fine is about.

                If one of the individual sites in the Fediverse were asked by Australian regulators, I bet they’d respond fully. It’s not quite the same situation as Twitter, either: none of these sites are large enough to require many staff members, nor do they have their own live-streaming platforms.