I’ve just re-discovered Ollama and it’s come a long way. It has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.
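To give a taste of how little is left to do once it’s installed: the service exposes a local API on port 11434, and you can talk to any model you’ve pulled with a few lines of Python. This is just a rough sketch, assuming the `requests` package is available and that you’ve already run `ollama pull llama3.2` (the model tag is only an example; swap in whatever you actually pulled):

```python
import requests

# Ollama's local API listens on port 11434 by default.
# "llama3.2" is an example tag; substitute any model you've pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Explain what a quantized model is in one sentence.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```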
I’d like to see Lemmy become useful for specific technical niches, instead of people having to hunt for the best existing community (which can be subjective and makes information difficult to find). So I created !Ollama@lemmy.world for everyone to discuss, ask questions, and help each other out with Ollama!
So please join, subscribe, and feel free to ask questions, post tips and projects, and help out where you can!
Thanks!
Instance-independent link: !Ollama@lemmy.world
Share links to communities this way, so everyone can subscribe easily.
You should also post about this in !newcommunities@lemmy.world and !communitypromo@lemmy.ca for better discoverability!
Thanks, will do all that!