• elshandra@lemmy.world · 8 months ago

    Interesting take: addiction to the convenience provided by AI driving the need for more. I suppose at the end of the day it’s probably the same brain chemistry involved. I think that’s what you’re getting at?

    In any case, this tech is only going to get better and more commonplace. Take it, or run for the hills.

    • treefrog@lemm.ee · 8 months ago

      No, harm reduction would be recognizing that an object causes harm, that people will use that object anyway, and doing what we can to minimize the harms caused by that use.

      It’s less about addiction and brain chemistry than simple math. If harm is being caused, and it can be reduced, reduce it.

      • elshandra@lemmy.world · 8 months ago

        Ah, so more like self-harm prevention, gotcha.

        I guess like any tool, whether it is help or harm depends on the user and usage.

        • treefrog@lemm.ee · 8 months ago

          Right, so I think the person’s point was that Microsoft is helping to manufacture the harm, and warning that the harm is there, but not doing much to actually reduce it.

          • elshandra@lemmy.world · 8 months ago (edited)

            Oh, right. Microsoft is a corp. They don’t care about the harm they do until it costs them money.

            e: Also, I love to bash on MS, but they’re not the problem here. These things are being built all over the place: in companies, in governments, in enthusiasts’ backyards. You can tell Microsoft, Google, and Apple to stop developing the code; you can tell Nvidia to stop developing CUDA. It’s not going to matter.

            • treefrog@lemm.ee · 8 months ago

              I just heard a news report on OpenAI developing technology to make deep fakes easier. They realized this could cause harm. So they’re only releasing it to a few educational institutions.

              This is harm reduction. And I realize corporate ethics is something of an oxymoron. But something along these lines is what the original person meant by a harm reduction approach from Microsoft. If they’re aware their technology is going to cause harms to democracy, they have an ethical duty to reduce those harms. Unfortunately, corporations often put the ethical duty to increase shareholder value first. That doesn’t mean they don’t have other ethical responsibilities too.

              • elshandra@lemmy.world · 8 months ago

                I suppose it could be harm reduction. Like peeling a bandaid off slowly instead of ripping it off.

                They’re here, they might not be everywhere yet, but they’re here to stay as much as photoshopped images or trick photography are. Just more lies to hide the truth.

                All we can do now is get better at dealing with them.

                • treefrog@lemm.ee · 8 months ago

                  I hear you about it just being an evolution of the propaganda machine. And I think it’s going to reveal cracks in the system. That it’s going to rip the bandaid off faster than climate change, which is the slow peel we’re all dealing with already.

                  Harm reduction would be investing money in government regulation, and lobbying for it. Usually regulation is seen as a disaster for business, but in this case it would throttle competitors too. And possibly save a lot of lives. Because this sort of automated propaganda is going to create a lot of fascist regimes all over the planet, propped up by the illusion of democracy.

                  More so than it already is.

    • Admiral Patrick@dubvee.org · 8 months ago

      I’m heading for the hills then. I’m perfectly capable of thinking for myself without delegating that to some chatbot.

      • elshandra@lemmy.world · 8 months ago

        Everyone is. As time and tech progresses, you’re going to find that it becomes increasingly difficult to avoid without going off-grid entirely.

        Do you really think corps won’t replace humans with AI the moment they can profit by doing so? That states aren’t eventually going to do the same?