this post was submitted on 07 Feb 2024
190 points (95.2% liked)

Technology


Abacus.ai:

We recently released Smaug-72B-v0.1, which has taken first place on HuggingFace's Open LLM Leaderboard. It is the first open-source model to achieve an average score above 80.

[–] simple@lemm.ee 45 points 9 months ago (10 children)

I'm afraid to even ask about the minimum specs on this thing; open-source models have gotten so big lately.

[–] girsaysdoom@sh.itjust.works 5 points 9 months ago (5 children)

I think I read somewhere that you'll basically need 130 GB of RAM to load this model. You could probably get used server hardware for less than $600 to run it.

[–] ArchAengelus@lemmy.dbzer0.com 10 points 9 months ago

Unless you’re getting used datacenter-grade hardware for next to free, I doubt this. You need 130 GB of VRAM on your GPUs.
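
The ~130 GB figure is roughly consistent with simple back-of-the-envelope arithmetic for a 72-billion-parameter model: weight storage scales linearly with parameter count and bytes per parameter. A minimal sketch (the function name and the 1 GB = 1e9 bytes convention are illustrative choices, and real loaders add overhead for activations and KV cache):

```python
# Rough weight-storage estimate for a 72B-parameter model.
# Approximate only: runtime memory is higher due to activations,
# KV cache, and framework overhead.
PARAMS = 72e9  # 72 billion parameters

def load_size_gb(params: float, bytes_per_param: float) -> float:
    """Approximate size of the model weights in GB (1 GB = 1e9 bytes)."""
    return params * bytes_per_param / 1e9

print(f"fp16:  {load_size_gb(PARAMS, 2.0):.0f} GB")  # 16-bit weights: ~144 GB
print(f"int8:  {load_size_gb(PARAMS, 1.0):.0f} GB")  # 8-bit quantized: ~72 GB
print(f"int4:  {load_size_gb(PARAMS, 0.5):.0f} GB")  # 4-bit quantized: ~36 GB
```

At 16-bit precision the weights alone are around 144 GB, so a quoted requirement in the 130 GB range implies some quantization or offloading; heavier quantization is what makes consumer-hardware inference plausible at all.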
