• Veraticus@lib.lgbtOP
    link
    fedilink
    English
    arrow-up
    1
    ·
    1 year ago

    we can be expressed as algorithms

    Wow, do you have any proof of this wild assertion? Has this ever been done before or is this simply conjecture?

    a thermostat also has an internal world

    No. A thermostat is an unthinking device. It has no thoughts or feelings and no “self.” In this regard it is the same as LLMs, which also have no thoughts, feelings, or “self.”

    A thermostat executes actions when a human acts upon it. But it has no agency and does not think in any sense; it simply does what it was designed to do. LLMs are to language as thermostats are to controlling HVAC systems, and nothing more than that.

    There is as much chance of your thermostat gaining sentience, if we give it more computing power, as there is of an LLM doing so.

    • barsoap@lemm.ee

      Wow, do you have any proof of this wild assertion? Has this ever been done before or is this simply conjecture?

      A Turing machine can compute any computable function. For a thing to exist in the real world it has to be computable; otherwise you break cause and effect itself, as the Church-Turing thesis doesn’t really rely on anything but there being implication.

      So, no, not proof. More an assertion of the type “Assuming the universe is not dreamt up by a Boltzmann brain and causality continues to apply, …”.

      A thermostat is an unthinking device.

      That’s a fair assessment, but beside the point: a thermostat has an internal state it can affect (the valve), one that is under its control and not that of silly humans (that is, not directly), aka an internal world.
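
      A minimal Python sketch of what I mean (a hypothetical toy model, not any real device's firmware): the human sets the target directly, but only the device's own feedback loop ever touches the valve. That valve is state under the system's control.

```python
class Thermostat:
    """Toy model: internal state (the valve) under the device's own control."""

    def __init__(self, target):
        self.target = target      # set by a human, directly
        self.valve_open = False   # internal state; humans never touch this

    def step(self, room_temp):
        # The feedback loop, not the human, decides the valve position.
        self.valve_open = room_temp < self.target
        return self.valve_open

t = Thermostat(target=21.0)
t.step(18.0)  # cold room: the valve opens
t.step(23.0)  # warm room: the valve closes
```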

      There is as much chance of your thermostat gaining sentience, if we give it more computing power, as there is of an LLM doing so.

      Also correct. But that’s because it’s a T1 system, not because the human mind can’t be expressed as an algorithm. Rocks are T0 systems, and I think you’ll agree dumber than thermostats; most of what runs on our computers is a T1 system; ChatGPT and all the AI we have is T2; the human mind is T3: our genes don’t merely come with instructions on how to learn (that’s ChatGPT’s training algorithm), but with instructions on learning how to learn. We’re as much more sophisticated than ChatGPT, for an appropriate notion of “sophisticated”, as thermostats are more sophisticated than rocks.
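
      To make the T1-versus-T2 distinction concrete, here is a toy Python illustration (my own sketch, not the formalism from the paper): a T1 system applies a fixed rule forever, while a T2 system additionally rewrites its own rule in response to feedback.

```python
# Toy illustration (not the paper's formalism) of one-traverse vs.
# two-traverse systems.

def t1_thermostat(temp, target=21.0):
    # T1: a fixed policy. The rule itself never changes.
    return temp < target

class T2Learner:
    # T2: also adapts its own rule (the threshold) from feedback.
    def __init__(self):
        self.threshold = 0.0

    def act(self, temp):
        return temp < self.threshold

    def learn(self, temp, correct_action):
        # Nudge the threshold toward whatever makes the action correct.
        if self.act(temp) != correct_action:
            self.threshold += 1.0 if correct_action else -1.0

learner = T2Learner()
for _ in range(30):
    learner.learn(20.0, True)  # feedback: "you should heat at 20 degrees"
print(learner.act(20.0))  # True: the rule itself has changed
```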

      • Veraticus@lib.lgbtOP

        That’s a fair assessment, but beside the point: a thermostat has an internal state it can affect (the valve), one that is under its control and not that of silly humans (that is, not directly), aka an internal world.

        I apologize if I was unclear when I spoke of an internal world. I meant interior thoughts and feelings. I think most people would agree sentience is predicated on the idea that the sentient object has some combination of its own emotions, motivations, desires, and ability to experience the world.

        LLMs have exactly as much of that as a thermostat does: zero. An LLM is a word-completion algorithm and nothing more.

        Your paper doesn’t bother to define what these T-systems are, so I can’t speak to your categorization. But I think rating the mental abilities of thermostats versus computers versus ChatGPT versus human minds is totally absurd. They aren’t on the same scale, they’re different kinds of things. Human minds have actual sentience. Everything else in that list is a device, created by humans, to do a specific task and nothing more. None of them is anything more than that.

        • barsoap@lemm.ee

          Your paper doesn’t bother to define what these T-systems are

          Have a look here. The key concept is the adaptive traverse; a Tn-system then means “a system with that many traverses”. What I meant with my comparison is simply that a rock has one traverse fewer than a thermostat, and ChatGPT has one traverse fewer than us.

          They aren’t on the same scale, they’re different kinds of things.

          Addition, multiplication and exponentiation are all on the same scale, yet they’re different things. Regarding the number of traverses, it’s absolutely fair to say that it’s a scale of quality, not quantity.

          Human minds have actual sentience.

          Sentience as in processing the environment while also processing your processing of that environment? Yep, that sounds like a T3 system. Going out on a bit of a limb: during deep sleep we regress to T2, while dreams are a funky “let’s pretend our conditioning/memory is the environment” state. Arachnids apparently can do it, and definitely all mammals. Insects seem to be T2, from the POV of my non-biologist ass.

          Everything else in that list is a device, created by humans, to do a specific task and nothing more.

          You are a device created by evolution to figure out whether your genes are adaptive enough to their surroundings to reproduce.

          • Veraticus@lib.lgbtOP

            I’m giving up here, but evolution did not “design” us. LLMs are designs, created with a purpose in mind, and they fulfill that purpose. Humans were not designed.

            • barsoap@lemm.ee

              In cybernetics that’s irrelevant, as the purpose of a system is what it does. I can design an algorithm that plays pong, or I can write a program to evolve one; they might actually end up being identical, and no one could tell.

              • Veraticus@lib.lgbtOP

                It’s not irrelevant at all. Even if you create a program to evolve pong, that program was itself designed by a human. As a computer programmer you should know that no computer program will just become pong; what an idiotic idea.

                You just keep pivoting from how you were using words to their meaning something entirely different; this entire argument is worthless. At least LLMs don’t change the definitions of the words they use as they use them.

                • barsoap@lemm.ee

                  Playing pong. Inputs: ball (and possibly opponent) position; output: paddle left or right. Something like NEAT will very quickly come up with the obvious “track the ball” approach, using just as many nodes as you would.
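
                  A hypothetical Python sketch of the point (a toy stand-in for NEAT, which really evolves network topologies): the hand-designed “track the ball” policy and the survivor of a crude evolutionary search can end up being the exact same function, so behavior alone can’t tell you whether a program was designed or evolved.

```python
import random

def designed_policy(ball_x, paddle_x):
    # The obvious hand-written "track the ball" rule.
    return "left" if ball_x < paddle_x else "right"

def fitness(policy, trials=100):
    # Score a policy: how often does the paddle move toward the ball?
    rng = random.Random(0)  # seeded, so scoring is deterministic
    score = 0
    for _ in range(trials):
        ball, paddle = rng.random(), rng.random()
        want = "left" if ball < paddle else "right"
        score += policy(ball, paddle) == want
    return score

# Toy "evolution": a tiny candidate pool standing in for a real
# evolutionary search. Selection keeps the fittest policy, and the
# survivor behaves identically to the designed one.
candidates = [
    lambda b, p: "left" if b < p else "right",   # tracks the ball
    lambda b, p: "right" if b < p else "left",   # runs away from it
    lambda b, p: "left",                         # constant
    lambda b, p: "right",                        # constant
]
evolved = max(candidates, key=fitness)
print(evolved(0.2, 0.8) == designed_policy(0.2, 0.8))  # True
```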