jesus this is gross man

  • HedyL@awful.systems · 13 hours ago

    As I’ve pointed out earlier in this thread, it is probably fairly easy to manipulate and control people if someone is devoid of empathy and a conscience. Most scammers and cult leaders appear to operate from similar playbooks, and it is easy to imagine how these techniques could be incorporated into an LLM (either intentionally or even unintentionally, as the training data is probably full of examples). Doesn’t mean that the LLM is in any way sentient, though. However, this does not imply that there is no danger. At risk are, on the one hand, psychologically vulnerable people and, on the other hand, people who are too easily convinced that this AI is a genius and will soon be able to do all the brainwork in the world.

    • diz@awful.systems · 11 hours ago · edited

      I think this may also be a specific low-level exploit: humans are already biased to mentally “model” anything as having agency (see all the sentient gods that humans invented for natural phenomena).

      I was talking to an AI booster (ewww) in another place and I think they really are predominantly laymen brain-fried by this shit. That particular one posted a convo where, out of 4 arithmetic operations, 2 were “12042342 can be written as 120423 + 19, and 43542341 as 435423 + 18”, combined with AI word salad, and he was expecting that this would be convincing (see the quick check below).
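      For the record, here’s a minimal sanity check of those two claimed decompositions (a quick sketch in Python; the numbers are exactly as quoted from that convo):

      ```python
      # Check the "decompositions" the LLM produced in that convo.
      claims = [
          (12042342, 120423, 19),   # "12042342 can be written as 120423 + 19"
          (43542341, 435423, 18),   # "43542341 as 435423 + 18"
      ]

      for target, a, b in claims:
          total = a + b
          # Neither sum comes anywhere near the original number.
          print(f"{a} + {b} = {total}  -> equals {target}? {total == target}")
      ```

      Both sums are off by roughly two orders of magnitude, so “completely shit at math” is not an exaggeration.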

      It’s not that this particular person thinks it’s a genius; he thinks that it is not a mere computer, and the way it is completely shit at math only serves to prove to him that it is not a mere computer.

      edit: And of course they care not for any mechanistic explanations, because all of those imply LLMs are not sentient, and they believe LLMs are sentient. The “this isn’t it, but one day some very different system will be” counterargument doesn’t help either.