I literally don’t care, AT ALL, about someone who’s too dumb not to kill themselves because of an LLM, and we sure as shit shouldn’t regulate something just because they (unfortunately) exist.
Yeah, if you had any awareness about how stupid and unlikeable you’re coming across to everybody who crosses your path, I think you would recognise that this is probably not a good maxim to live your life by.
It should be noted that the only person in the article to lose his life died because the police, who were explicitly told to be ready to use non-lethal means to subdue him because he was in the middle of a mental episode, immediately gunned him down when they saw him coming at them with a kitchen knife.
But here’s the thrice cursed part:
“You want to know the ironic thing? I wrote my son’s obituary using ChatGPT,” Mr. Taylor said. “I had talked to it for a while about what had happened, trying to find more details about exactly what he was going through. And it was beautiful and touching. It was like it read my heart and it scared the shit out of me.”
i for sure agree that LLMs can be a huge trouble spot for mentally vulnerable people and there needs to be something done about it
my point was more about him using it for his worst-of-both-worlds arguments, where he’s simultaneously declaring that ‘alignment is FALSIFIED!’ while also heavily anthropomorphizing to confirm his priors (it’d be harder to argue that with something like claude, which has a much more robust system and leans more towards a ‘maybe’ on the question of whether it should be anthro’d), and doing it all off the back of someone’s death
@Anomalocaris @visaVisa The attention spent on people who think LLMs are going to evolve into The Machine God will only make good regulation & norms harder to achieve
can we agree that Yudkowsky is a bit of a twat?
but also that there’s a danger in letting vulnerable people access LLMs?
not saying that they should be banned, but some regulation and safety is necessary.
" I don’t care if innocent people die if it inconvenience me in some way."
yhea, opinion dismissed
it didn’t take me long at all to find the most recent post with a slur in your post history. you’re just a bundle of red flags, ain’t ya?
don’t let that edge cut you on your way the fuck out
yeah, we should be talking about this
just not talking with him
yeah, we need reasonable regulation now, about the real problems it has.
like making them liable for training on stolen data,
making them liable for giving misleading information, and damages caused by it…
things that would be reasonable for any company.
do we need regulations about it becoming skynet? too late for that mate