MasterNerd@lemm.ee to Selfhosted@lemmy.world • Is it possible to run a LLM on a mini-PC like the GMKtec K8 and K9?
Look into Ollama. It shouldn't be an issue if you stick to 7B-parameter models.
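Some back-of-envelope math on why 7B models are the practical ceiling here (a sketch: `model_ram_gb` is a hypothetical helper, and the ~20% runtime overhead factor is an assumption, not a measured figure). Ollama ships most models quantized to roughly 4 bits per weight, so a 7B model fits comfortably in the 16-32 GB of RAM a mini-PC like these typically has, while unquantized fp16 weights would need about 4x more.

```python
# Rough RAM estimate for running an LLM locally (back-of-envelope sketch).
def model_ram_gb(params_billion: float, bytes_per_param: float,
                 overhead: float = 1.2) -> float:
    """Weights * per-parameter size, plus ~20% for KV cache and runtime
    (the overhead factor is an assumed ballpark, not a measured value)."""
    return params_billion * 1e9 * bytes_per_param * overhead / 2**30

# 7B model at ~4-bit quantization (0.5 bytes/param): about 4 GB
print(round(model_ram_gb(7, 0.5), 1))

# Same model at fp16 (2 bytes/param): roughly 4x more
print(round(model_ram_gb(7, 2), 1))
```

A 13B quantized model is often still feasible on 16 GB, but anything larger starts swapping or refusing to load, which is why 7B is the safe recommendation.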
Yeah, I love Lemmy, but I feel like it's being held up by a few people. I comment and even occasionally post to try to help with that, but it's kind of scary how much weight rests on a few people's shoulders, and that's not even accounting for server owners and the devs.
I don’t have any experience with them honestly so I can’t help you there