You know how Google’s new AI Overviews feature is prone to spitting out wildly incorrect answers to search queries? In one instance, AI Overviews told a user to put glue on pizza to keep the cheese from sliding off (pssst… please don’t do this).

Well, according to an interview with Google CEO Sundar Pichai published by The Verge earlier this week, just before criticism of the outputs really took off, these “hallucinations” are an “inherent feature” of the AI large language models (LLMs) that drive AI Overviews, and this feature “is still an unsolved problem.”

    • 9488fcea02a9@sh.itjust.works · 4 months ago (edited)

      I hate the AI hype right now, but to say the entire thing should fail is short-sighted.

      Imagine people saying the following: “The internet is just hype. I get too many spam emails. I hope the entire thing is a catastrophic failure.”

      Imagine if we had just shut down the entire internet because the dotcom bubble was overhyped and full of scams…

        • KevonLooney@lemm.ee · 4 months ago

          ?

          Have you never used any of these tools? They’re excellent at doing simple things very fast. But they’re like word processors in the ’90s: just tools, not the fount of all knowledge.

          I guess younger people won’t know this, but word processors were very impressive when they first came out. They replaced typewriters; a page from a printer looked much more professional than even the best typewriter output. This lent an air of credibility to anything printed from a computer, because computers were new and expensive.

          Think about that now. Do you automatically trust anything that’s just printed on a piece of paper? No, because that’s stupid. Anyone can just print whatever they want. LLMs are like that now. They can just say whatever they want. It’s up to you to make sure it’s true.