- cross-posted to:
- technology@lemmy.world
- wheresyouredat@rss.ponder.cat
Spooky stuff that helps explain a lot of the dysfunction flowing out from Microsoft.
It seems this happens to management at every company, to varying degrees. I swear there must be a common source for all this, like Forbes or something.
A side note:
No, just no.
Everything generative AI produces is a hallucination.
Some may correlate with reality, but it is still a hallucination.
LLMentalist
I don’t like how people use “hallucinations” to refer to the output of neural networks, but you know what, it is all hallucination. It’s hallucination on our part, looking at arbitrary sequences of tokens and seeing meaningful text. It’s pareidolia, and it’s powerful.
I think he’s underestimating the intentionality at play here. The dynamic he’s describing (and describing very well!) has been evident since the first chatbot, ELIZA. I don’t believe that Saltman and friends are unaware of this dynamic, though I’ll give them the benefit of the doubt that they don’t think we had AGI in the 80s with basic text templates.
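For anyone who hasn’t seen how simple that dynamic’s trigger can be: ELIZA-style chatbots are just pattern-matched templates. A minimal sketch (the rules below are illustrative, not Weizenbaum’s actual DOCTOR script):

```python
import re

# ELIZA-style responder: match a regex, echo the capture into a canned template.
# These rules are made up for illustration, not the historical script.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]
FALLBACK = "Please go on."

def respond(text: str) -> str:
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(*m.groups())
    return FALLBACK

print(respond("I feel sad"))  # Why do you feel sad?
print(respond("Hello"))       # Please go on.
```

A few dozen rules like these were enough to make people in the 60s confide in a terminal, which is exactly the pareidolia being discussed.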