• Ech@lemm.ee · 4 points · 19 days ago (edited)

      I was able to run LLMs on a GTX 1080. They were admittedly small ones, but a 3090 is enough to be usable. That said, I’m not convinced it’s a good idea to use one for therapy. I expect it’s about as useful as talking into a mirror.
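
      For reference, a minimal sketch of what running a small model on an older card looks like with llama-cpp-python; the model file and settings here are assumptions for illustration, not a recommendation:

      ```python
      # Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
      # The model path is a placeholder: any small GGUF quant (e.g. a 7B
      # at Q4) should fit in a 1080's 8 GB of VRAM.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
          n_gpu_layers=-1,  # offload all layers to the GPU
          n_ctx=2048,       # modest context window to keep memory use down
      )

      out = llm("Q: What is the capital of France? A:", max_tokens=32)
      print(out["choices"][0]["text"])
      ```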

    • Monkey With A Shell@lemmy.socdojo.com · 2 points · 19 days ago

      I think I’m on a 3060 or so and it works decently depending on the model. I can generally get away with around 13B models, or some 20B+ at Q4 quantization, but they get really slow by that point (rough math sketched below).

      It’s a lot of messing around to find something that performs decently while not being so limited that it gets crazy repetitive or says loony things.
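
      The back-of-the-envelope math for what fits is just parameter count times bytes per weight plus some overhead. A rough sketch, assuming ~4.5 bits/weight for Q4 and a loose 20% overhead factor for KV cache and runtime buffers (not a measured figure):

      ```python
      # Rough VRAM estimate for a quantized model: weight bytes plus a
      # fudge factor for KV cache and runtime buffers. The 1.2 overhead
      # multiplier is an assumption, not a measured number.
      def est_vram_gb(params_b: float, bits_per_weight: float,
                      overhead: float = 1.2) -> float:
          weights_gb = params_b * 1e9 * bits_per_weight / 8 / 1e9
          return weights_gb * overhead

      # 13B at Q4 (~4.5 bits/weight) vs a 12 GB 3060:
      print(f"13B @ Q4: ~{est_vram_gb(13, 4.5):.1f} GB")  # ~8.8 GB, fits
      # 20B at Q4 is past the card's 12 GB, which matches the slowdown:
      print(f"20B @ Q4: ~{est_vram_gb(20, 4.5):.1f} GB")  # ~13.5 GB, spills over
      ```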

    • chicken@lemmy.dbzer0.com · 2 points · 18 days ago

      A 3090 is great because it has about the same amount of VRAM (24 GB) as newer cards, and VRAM is what determines which models you can run.

    • SchizoDenji@lemm.ee · 1 point · 18 days ago

      That’s the best GPU for this use case. Maybe run two of them if you want the best of the best.
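
      If anyone does go the 2x route, llama.cpp-style runners can split one model’s weights across cards. A minimal sketch with llama-cpp-python; the model file and the even split are assumptions:

      ```python
      # Sketch of splitting one model across two GPUs with llama-cpp-python.
      # tensor_split sets the fraction of weights per device; a 50/50 split
      # across two 24 GB 3090s is assumed here.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./llama-70b.Q4_K_M.gguf",  # hypothetical large quant
          n_gpu_layers=-1,          # offload everything
          tensor_split=[0.5, 0.5],  # half the weights on each card
      )
      ```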