• Flying Squid@lemmy.world
    7 days ago

    LLMs are already training themselves on their own conversations, and their hallucinations become a feedback loop: the longer training continues, the more “factual” those hallucinations appear.

    • Tartas1995@discuss.tchncs.de
      7 days ago

      Yeah, that is my point.

      It will become more and more conservative over time. Since I don’t think you should use it for anything but inspiration, my point is that it will encourage an increasingly conservative approach to, e.g., coding.