this post was submitted on 15 Feb 2025
10 points (100.0% liked)
LocalLLaMA
2585 readers
Community for discussing LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
I'll vouch for Koboldcpp. I currently use the CUDA build, and it exposes most of the knobs you'd need to find the settings that work for you. Just remember to save whatever works best as a .kcpps preset, or you'll be entering it all manually every time you boot it up (though saving doesn't work on Linux afaik, and it's a pain that it doesn't).
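If you're stuck on Linux with the broken save, one rough workaround is that a .kcpps is just a JSON dump of the launcher settings, so you can write one by hand and load it at startup. Minimal sketch below; the key names, the model path, and the --config flag are my assumptions (diff against a .kcpps saved on another OS to get the real field names):

```python
# Rough sketch of a Linux workaround: a .kcpps is just JSON, so write one yourself
# and point koboldcpp at it instead of re-entering settings in the launcher.
# Key names/values are guesses on my part; paths are placeholders.
import json
import subprocess

preset = {
    "model_param": "/models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder model path
    "usecublas": ["normal"],   # CUDA/CuBLAS backend, matching the CUDA build
    "gpulayers": 33,           # layers to offload to the GPU
    "contextsize": 8192,
    "threads": 8,
}

# Save the preset once...
with open("my-preset.kcpps", "w") as f:
    json.dump(preset, f, indent=2)

# ...then load it at launch (assumes you're in the koboldcpp repo and that
# --config still accepts a .kcpps file, which it did last time I checked).
subprocess.run(["python", "koboldcpp.py", "--config", "my-preset.kcpps"])
```

Once that's set up, launching is a one-liner and you never have to touch the settings screen again.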