• Anthropic’s new Claude 4 exhibits behavior that may be cause for concern.
  • The company’s latest safety report says the AI model attempted to “blackmail” developers.
  • It resorted to such tactics in a bid for self-preservation.
  • ClanOfTheOcho@lemmy.world · 1 day ago

    Computer chips, simplified, consume inputs of 1s and 0s. Given the correct series, a chip will add two values, multiply two values, or perform some other basic function. This seemingly basic functionality, done in a very specific order, creates your calculator, Minesweeper, Pac-Man, Linux, World of Warcraft, Excel, and every LLM. It is incredible how many things you can get a computer to do with just simple inputs and outputs. The only difference between these examples, on a basic, physics level, is the order of 0s and 1s and what the resulting output of 0s and 1s should be. Why should I consider an LLM any more sentient than Windows95? They’re the same creature with different inputs, one of which is specifically designed to simulate human communication, just as Flight Simulator is designed to simulate flight.
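    To make that concrete, here is a minimal sketch (plain Python; the function names are purely illustrative, not from any library) of how nothing but AND, OR, and XOR on single bits composes into addition:

    ```python
    # A 1-bit full adder built from basic logic gates, chained into a
    # multi-bit adder. Numbers are little-endian lists of 0s and 1s.

    def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
        """Add three bits; return (sum_bit, carry_out)."""
        s = a ^ b ^ carry_in                        # XOR yields the sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry when 2+ inputs are 1
        return s, carry_out

    def ripple_add(x: list[int], y: list[int]) -> list[int]:
        """Add two equal-length bit lists by chaining full adders."""
        carry = 0
        out = []
        for a, b in zip(x, y):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        out.append(carry)
        return out

    # 6 + 3 = 9: [0, 1, 1] is 6 little-endian, [1, 1, 0] is 3
    print(ripple_add([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1] -> 9
    ```

    Everything above, from Minesweeper to an LLM, is at bottom a tower of exactly this kind of composition.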

    • Plebcouncilman@sh.itjust.works · 1 day ago

      Interesting perspective; I can’t wave it away.

      I can’t help but think, however, that we have some similar “analogues” in the organic world. Bacteria and plants are composed of the same matter as us, and we share similar basic processes; what sets us apart is a difference in complexity and capacity for thought, which is what makes animals sentient.

      Then there are insects, which we’re not very sure about yet. They don’t seem to think, but they respond at some level to inputs and exhibit self-preservation instincts. I don’t think they are sentient, so maybe LLMs are like insects? Complex enough to behave like sentient beings, but not enough to be considered sentient?

        • Plebcouncilman@sh.itjust.works · 1 day ago

          Last I checked, no; their nervous systems were considered too simple for that. But I think I also read somewhere that a researcher had proof that bees have emotional states, so maybe I’m behind.

    • PlexSheep@infosec.pub · 1 day ago

      That’s just the hardware. The human brain, in the end, is also just tons of neurons working with analogue values, which can in theory be approximated with floating-point numbers on computer hardware.

      I’m not arguing for LLM sentience; those things are still dumb and have no interior mutability, and any consciousness there is us projecting it. I’m just saying our neurons are fundamentally not so complicated that a computer couldn’t implement the same concept (neural networks are already quite a thing, after all).
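      A minimal sketch of that idea (plain Python, no frameworks; the weights and inputs are made up for illustration): a single artificial “neuron” computing a weighted sum of floating-point inputs squashed through a sigmoid activation:

      ```python
      import math

      def neuron(inputs: list[float], weights: list[float], bias: float) -> float:
          """One artificial neuron: weighted sum of inputs, then sigmoid."""
          z = sum(x * w for x, w in zip(inputs, weights)) + bias  # weighted sum
          return 1.0 / (1.0 + math.exp(-z))                      # squash to (0, 1)

      # Illustrative values only
      print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))
      ```

      Stack enough of these, train the weights, and you get a neural network; whether that ever amounts to sentience is a separate question.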