• Milk_Sheikh@lemmy.dbzer0.com · +92/-5 · 2 days ago

    Your Brain on ChatGPT

    …LLM users displayed the weakest connectivity. Cognitive activity scaled down in relation to external tool use… LLM users also struggled to accurately quote their own work. While LLMs offer immediate convenience, our findings highlight potential cognitive costs. Over four months, LLM users consistently underperformed at neural, linguistic, and behavioral levels.

    Outsourcing thinking from your brain to an AI literally makes you dumber and less confident in the output, and teaches you nothing.

    Call me a Luddite or a hater, but if you’re one of the people who uses AI as a shortcut around actual thought or learning, I will judge you and disregard your output and opinions. Form your own basis of understanding and knowledge instead of relying on a teaspoon-deep summary that is frequently incorrect.

    • Soup@lemmy.world · +19/-4 · 2 days ago

      They say that with an Anki deck, using it is only half the battle, because a lot of the learning comes from the act of making the deck yourself. That advice is older than these LLMs, and it really showcases a big reason why they suck. Personally, I haven’t even used autocorrect since 2009.

      Being a Luddite, I feel, requires a strictly abstinence-only approach. Knowing what is worth off-loading and what is worth doing yourself is just being smart. I’m really glad that I don’t need to know every detail of modern life, but I still take a lot of pride in knowing how quite a lot of it works.

        • Scranulum@feddit.nu · +3 · 24 hours ago

          No autocorrect? Pfft, filthy casuals. I haven’t even stricken a line through a word since the Carter administration.

        • Soup@lemmy.world · +3/-2 · 19 hours ago

          Oh for sure, I won’t argue that, but it does illustrate my point. Even when I use a program with the squiggly red line, I correct mistakes myself so that I can reinforce the correct spelling.

    • SugarCatDestroyer@lemmy.world · +9/-3 · edited · 2 days ago

      I remember a movie about the future where some guy couldn’t figure out which shape fit into which hole and kept trying to force a cube into a round one. I don’t remember it exactly, but it’s not so funny when it becomes reality… In any case, given enough comfort and convenience, the human brain, so to speak, adapts in the bad sense of the word to whatever is easier.

      • Carrot@lemmy.today · +5 · 1 day ago

        LLMs are less replacing the need for log tables, and more replacing the need to understand why you need a log table. Less replacing a calculator, and more replacing the fundamental understanding of math. Sure, you could argue that it doesn’t matter whether people know math, and in the end you might be right. But given that ChatGPT can and will spit out random numbers instead of a real answer, I’d rather have people who actually understand math designing buildings, and people who actually understand anatomy and medicine performing surgery. Sure, a computer science guy cheating his way through school and his entire career with ChatGPT probably won’t set anyone back other than himself and the companies that hire him, but he isn’t the only one using the “shortcut” that is ChatGPT.
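
        As a concrete aside on what a log table actually bought you, here is a rough sketch; the two log values are the standard base-10 entries one would have looked up in the table:

            \log(37 \times 24) = \log 37 + \log 24 \approx 1.5682 + 1.3802 = 2.9484
            37 \times 24 = 10^{2.9484} \approx 888

        The understanding being offloaded is the first line: knowing that logarithms turn multiplication into addition at all.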

        • gmtom@lemmy.world · +2 · 24 hours ago

          I was never taught what log tables actually are. Any time logarithms were brought up, it was just “type it into your calculator and it will tell you.”

          • Carrot@lemmy.today · +2 · 20 hours ago

            That wasn’t my experience in school, but there’s a good chance you were just in an introductory class or similar. However, that doesn’t change anything about my argument. If you needed the log of something, you knew you had to look it up in a table to solve the problem. ChatGPT removes the need to even understand that you can use a log to solve a problem, and instead just spits out an answer. Yes, people can use ChatGPT to accelerate learning, as one would a calculator, and in those instances I think it’s somewhat valuable, if you completely ignore the fact that it will lie to your face and claim to be telling the truth. However, anecdotally, I know quite a few folks who are using it as a replacement for learning and thinking, which is the danger people are talking about.
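
            To make “using a log to solve a problem” concrete, here is a rough sketch of the step that gets skipped (log 2 ≈ 0.30103 is the value one would pull from the table):

                2^x = 100
                x \log 2 = \log 100 = 2
                x = \frac{2}{0.30103} \approx 6.64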

      • Jankatarch@lemmy.world · +5/-1 · edited · 1 day ago

        A better comparison would be playing a song on the radio and saying “see, I can produce music.” You still don’t know anything about music production in the end.

        • gmtom@lemmy.world · +2/-1 · 24 hours ago

          Personally, I don’t think that’s a good comparison. I would say it’s more like taking a photo and claiming you know how to paint. You’re still actually creating something, but using a digital tool that does it for you. You choose the subject and fiddle with settings to get an image closer to what you want, and you can then take it into software to edit it further.

          It’s art in its own right, but you shouldn’t directly compare it to painting.

          • Carrot@lemmy.today · +5/-1 · 20 hours ago

            Even that is a bad analogy; it’s more like commissioning a painter to paint something for you and then claiming you know how to paint. You told an entity that knows how to do the work what you wanted, and it gave it to you. Sure, you can ask for tweaks here and there, but in terms of artistic knowledge you didn’t need any and didn’t provide any, and you didn’t really directly create anything. Taking a decent photo requires more knowledge than generating something on ChatGPT, not to mention actually being in front of the thing you want a photo of.

              • Carrot@lemmy.today · +1 · 3 hours ago

                Care to explain? I think your analogy gives credit for creating art to someone who didn’t actually create it, and is thus flawed.

                • gmtom@lemmy.world · +1 · 58 minutes ago

                  I mean, I think I explained myself quite well already, and not to be insulting to you, but I don’t think you’re willing to accept any argument I would make that goes against what you already believe, since your argument against it is simply you asserting your own belief (that AI art isn’t art) as an immutable fact.

      • Milk_Sheikh@lemmy.dbzer0.com · +16 · 2 days ago

        There’s a key difference between using a tool to crunch a known mathematical equation (you cannot just say “find X” to a calculator; you have to punch in the right inputs, which requires understanding) and simply asking the teacher for the answer.

        Treat AI like the hermit oracle/shaman/diviner of yesteryear and you’ll get the same results: idiots who don’t know how to think for themselves and blindly accept whatever they are told.