this post was submitted on 07 Dec 2023
304 points (97.5% liked)
I would kill to run my models on my own AMD Linux server.
Does GPT4all not allow that? Or do you have specific other models in mind?
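For reference, here's a minimal sketch of running a model locally with the gpt4all Python bindings; the model filename is illustrative (gpt4all will download it on first use if it isn't already on disk):

```python
# Minimal local-inference sketch using the gpt4all Python bindings.
# The model filename below is illustrative; the library downloads it
# on first use if it is not already present locally.
from gpt4all import GPT4All

# Load a small quantized model; device="cpu" keeps inference off the GPU.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf", device="cpu")

# Generate a short completion entirely locally.
with model.chat_session():
    reply = model.generate("Explain what a quantized LLM is.", max_tokens=200)
    print(reply)
```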
I haven't really looked into it, but I'm not interested in playing the GPU game against the gamers. If AMD made a Tesla equivalent with gobs of RAM and no display hardware, I'd be all about it.
Right now it's looking like I'm going to build a server with a pair of K80s off eBay for a hundred bucks, which will give me 48GB of VRAM to run models in.
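As a rough sanity check on what fits in 48GB, here's a back-of-the-envelope sketch; the 1.2x overhead factor for KV cache and activations is an assumption, not a measured figure:

```python
# Back-of-the-envelope VRAM estimate for quantized LLM weights.
# The 1.2x overhead factor (KV cache, activations) is a rough assumption.
def est_vram_gb(n_params_billion: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A pair of K80s offers roughly 48 GB total (24 GB per card).
budget_gb = 48
for name, params, bits in [("7B @ 4-bit", 7, 4),
                           ("13B @ 4-bit", 13, 4),
                           ("70B @ 4-bit", 70, 4)]:
    need = est_vram_gb(params, bits)
    print(f"{name}: ~{need:.1f} GB -> {'fits' if need <= budget_gb else 'too big'}")
```

By that estimate even a 4-bit 70B model squeezes into 48GB, though note that each K80 presents as two 12GB GPUs, so a big model has to be sharded across four devices rather than loaded into one 48GB pool.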
Some of the LLMs it ships with are very reasonably sized and still impressive. I can run them on a laptop with 32GB of RAM.
This is very interesting! Thanks for the link. I'll dig into this when I manage to find some time.