…LLM users displayed the weakest connectivity. Cognitive activity scaled down in relation to external tool use… LLM users also struggled to accurately quote their own work. While LLMs offer immediate convenience, our findings highlight potential cognitive costs. Over four months, LLM users consistently underperformed at neural, linguistic, and behavioral levels.
Outsourcing thinking from your brain to an AI literally makes you dumber, less confident in the output, and teaches you nothing.
Call me a Luddite or a hater, but if you’re one of the people who uses AI as a shortcut to actual thought or learning, I will judge you and disregard your output and opinions. Form your own basis of understanding and knowledge instead of a teaspoon-deep summary that is frequently incorrect.
They say that, when making an Anki deck, using it is only half the battle because a lot of the learning comes from the act of making it yourself. That advice is older than these LLMs and it really showcases a big reason why they suck. Personally, I haven’t even used autocorrect since 2009.
Being a Luddite, I feel, requires a highly abstinence-only approach. Knowing what is worth off-loading and what is worth doing yourself is just being smart. I’m really glad that I don’t need to know every detail of modern life, but I still take a lot of pride in knowing how quite a lot of it works.
Genuinely the weirdest flex I have ever seen.
No autocorrect? Pfft, filthy casuals. I haven’t even stricken a line through a word since the Carter administration.
Oh for sure, I won’t argue that, but it does explain my point. Even when I use a program with the squiggly red line I correct it myself so that I can reinforce the correct spelling.
I remembered a movie about the future where some guy couldn’t figure out how to insert the right shape into a hole and he tried to insert a cube into a round hole, I don’t remember exactly, but it’s not so funny when it becomes reality… In any case, due to excessive comfort or convenience, the human brain, so to speak, adapts in the bad sense of the word to what is easier.
Idiocracy - https://www.imdb.com/title/tt0387808/
Was that at the current regime’s swearing-in, right?
I already watched it the other day to be honest, but thanks anyway. :3
and that is exactly why rich people are demonstrably dumber and more disconnected than most: their life contains far fewer challenges they actually have to overcome.
The business of a king separates him from the world…
I agree all rich people are stupid people
Here I probably agree with you… by the way, did you know about the secret underground cities for the elite, running on special nuclear reactors? They hope to avoid problems there if a collapse occurs :333
Same with using a calculator, no? Or not memorising log tables.
LLMs are less replacing the need for log tables, and more replacing the need to understand why you need a log table. Less replacing a calculator and more replacing the fundamental understanding of math. Sure, you could argue that it doesn’t matter if people know math, and in the end you might be right. But given that ChatGPT can and will spit out random numbers instead of a real answer, I’d rather have someone who actually understands math designing buildings, and people who actually understand anatomy and medicine being surgeons. Sure, a computer science guy cheating with ChatGPT through school and his entire career probably won’t be setting anyone back other than himself and the companies that hire him, but they aren’t the only ones using the “shortcut” that is ChatGPT.
I was never taught what log tables actually are. Anytime logarithms were brought up, it was just “type it into your calculator and it will tell you.”
That wasn’t my experience in school, but there’s a good chance you were just in an introductory class or similar. However, that doesn’t change anything about my argument. If you need the log of something, you knew that you needed to look up the log in a table to solve the problem. ChatGPT removes the need to even understand that you can use a log to solve a problem, and instead spits out an answer. Yes, people can use ChatGPT to accelerate learning, as one would a calculator, and in those instances I think it’s somewhat valuable if you completely ignore the fact that it will lie to your face and claim to be telling you the truth. However, anecdotally I know quite a few folks that are using it as a replacement for learning/thinking, which is the danger people are talking about.
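The calculator-vs-understanding distinction above can be made concrete. Given something like 2^x = 1000, no calculator button answers “find x” directly: the human has to know that taking a logarithm is the move, and only then does the tool do the arithmetic. A minimal sketch in Python (the specific equation is just an illustration, not from the thread):

```python
import math

# Solving 2**x = 1000. The math library only evaluates functions;
# knowing to apply a logarithm (and the change-of-base rule) is the
# human's contribution -- the part an LLM answer would skip past.
x = math.log(1000) / math.log(2)  # x = log(1000) / log(2), i.e. log base 2 of 1000

# Check that the recovered exponent actually solves the equation.
assert math.isclose(2 ** x, 1000)
print(x)
```

The tool here plays the role of the log table or calculator: it evaluates, but it cannot choose the method.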
A better comparison would be playing a song on the radio and saying “see, I can produce music.” You still don’t know anything about music production in the end.
Personally I don’t think that’s a good comparison. I would say it’s more like taking a photo and claiming you know how to paint. You’re still actually creating something, but using a digital tool that does it for you. You chose the subject and fiddled with settings to get an image closer to what you want, and then can take it into software to edit it further.
It’s art in its own right, but you shouldn’t directly compare it to painting.
Even that is a bad analogy, it’s like commissioning a painter to paint something for you, and then claiming you know how to paint. You told an entity that knows how to do stuff what you wanted, and it gave it to you. Sure, you can ask for tweaks here and there, but in terms of artistic knowledge, you didn’t need any and didn’t provide any, and you didn’t really directly create anything. Taking a decent photo requires more knowledge than generating something on ChatGPT. Not to mention actually being in front of the thing you want a photo of.
I think my analogy is more accurate
Care to explain? I think your analogy gives the credit of art creation to someone who didn’t create art, and thus is flawed.
I mean, I think I explained myself quite well already, and not to be insulting to you, but I don’t think you’re willing to accept any argument I would make that goes against what you already believe, since your argument against it is simply you asserting your own beliefs (that AI art isn’t art) as an immutable fact.
There’s a key difference between using a tool to crunch a known mathematical equation (because you cannot just say “find X” to the calculator) and having to punch in the right inputs - ergo requiring understanding - and simply asking the teacher for the answer.
Treat AI like the hermit oracle/shaman/divinator of yesteryear, and you’ll get the same results - idiots who don’t know how to think for themselves, and blindly accept what they are told.