this post was submitted on 02 Oct 2024
331 points (91.5% liked)
Built on unearned hype.

[–] brucethemoose@lemmy.world 13 points 1 month ago* (last edited 1 month ago) (1 children)

I would only use the open-source models anyway, but it just seems rather silly from what I can tell.

I feel like the last few months have been an inflection point, at least for me. Qwen 2.5 and the new Command-R really make a 24GB GPU feel "dumb, but smart" — useful enough that I pretty much always keep Qwen 32B loaded on the desktop for its sheer utility.

It's still in the realm of enthusiast hardware (aka a used 3090), but hopefully that's about to be shaken up with bitnet and some stuff from AMD/Intel.
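The back-of-the-envelope math on why a 32B model squeezes onto a 24GB card (and why bitnet would shake this up) is roughly this — a sketch assuming standard 4-bit quantization and bitnet's ~1.58 bits per weight, ignoring KV cache and runtime overhead:

```python
def model_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough VRAM needed for model weights alone, in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# Qwen 32B at 4-bit quantization: weights alone are ~16 GB,
# leaving ~8 GB of a 24GB card for KV cache and context.
q4 = model_vram_gb(32, 4.0)
print(f"32B @ 4-bit:  {q4:.1f} GB")   # ~16.0 GB

# The same model as a bitnet (~1.58 bits/weight) would need only ~6.3 GB,
# which is why bitnet could push this class of model off enthusiast hardware
# and onto ordinary consumer GPUs.
b158 = model_vram_gb(32, 1.58)
print(f"32B @ bitnet: {b158:.1f} GB")  # ~6.3 GB
```

In practice the unquantized 16-bit weights would be ~64 GB, which is why 4-bit quants are what make a used 3090 viable at all.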

Altman is basically a vampire, though, and thankfully I think he's going to burn OpenAI to the ground.