• Pasta Dental@sh.itjust.works · 2 months ago

    I'll believe it when I see it: an LLM is basically a random box; you can't patch it with 100% certainty. The only reliable way to stop it from generating bomb recipes is to remove that data from the training set.
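
    To illustrate the "random box" point: a minimal sketch below (with a made-up toy vocabulary and hypothetical logits, not any real model) of how token sampling works. Because each token is drawn from a probability distribution, a disallowed output can be made unlikely but never provably impossible, which is why filter-style patches can't give a 100% guarantee.

    ```python
    import numpy as np

    rng = np.random.default_rng()

    def sample_token(logits, temperature=1.0):
        """Temperature sampling: scale logits, softmax, then draw at random."""
        scaled = np.asarray(logits) / temperature
        probs = np.exp(scaled - scaled.max())  # numerically stable softmax
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Hypothetical vocabulary and logits for a single decoding step.
    vocab = ["safe", "answer", "refuse", "forbidden"]
    logits = [2.0, 1.5, 1.0, -3.0]  # "forbidden" is unlikely, not impossible

    counts = {w: 0 for w in vocab}
    for _ in range(100_000):
        counts[vocab[sample_token(logits)]] += 1
    print(counts)  # "forbidden" still appears occasionally: low probability != zero
    ```

    Running it shows the low-probability token still surfacing a handful of times per hundred thousand draws, which is the gap between "rare" and "patched".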