this post was submitted on 12 Feb 2025
56 points (96.7% liked)
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 2 years ago
I'm inclined to agree. Lower bandwidth might make some tasks take longer, but you can still accomplish them if you're patient. When you're out of RAM, you're out of RAM.
Time to let it page out to a platter hard drive so you can be extra patient while it takes an eternity performing memory swaps at 5400 RPM 😂
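The joke has a real mechanism behind it: with memory-mapped weights (the default loading path in llama.cpp, for instance), a model file larger than physical RAM can still "run" because the kernel faults pages in from disk on demand; on a 5400 RPM drive each random fault can cost a ~10 ms seek, which is where the eternity comes from. A minimal sketch of that on-demand paging behavior, using a throwaway file as a stand-in for model weights (the file name and size here are illustrative, not anything real):

```python
import mmap
import os
import tempfile

# Create a 16 MiB stand-in for a model weights file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (16 * 1024 * 1024))
    path = f.name

# Memory-map it read-only: nothing is loaded into RAM up front.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Touching a byte triggers a page fault; the kernel reads only
    # that page from disk. On a platter drive, scattered faults each
    # pay a mechanical seek -- hence the extra patience required.
    first = mm[0]
    last = mm[len(mm) - 1]
    mm.close()

os.remove(path)
print(first, last)
```

The same principle is why swap "works" but crawls: correctness is preserved, only latency explodes.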
Worried that AI is going to overthrow humankind? Cripple it by making it run off a mechanical hard drive!