this post was submitted on 12 Feb 2025
56 points (96.7% liked)
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 2 years ago
I have never in my life regretted opting for more RAM. Even at the cost of overall performance, more RAM has always been the right choice, especially in the long run: it has greatly increased the longevity of my systems.
But I do have several systems whose lack of RAM I regret. Those are usually systems where I didn't have any other choice, like the Steam Deck.
I'm inclined to agree. Lower bandwidth might make some tasks take longer, but you can still accomplish them if you're patient. When you're out of RAM, you're out of RAM.
Time to let it page out to a platter hard drive so you can be extra patient while it takes an eternity performing memory swaps at 5400 RPM 😂
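To put that joke in numbers, here's a rough back-of-envelope sketch comparing how long one pass over a model's weights takes from RAM versus from a 5400 RPM platter drive. All throughput and size figures below are assumptions for illustration, not measurements:

```python
# Back-of-envelope: one full pass over model weights from RAM vs. a
# 5400 RPM HDD (roughly what swapping forces on every token).
# All numbers are assumed ballpark figures, not benchmarks:
#   ~100 MB/s sustained sequential read for a 5400 RPM drive
#   ~50 GB/s for dual-channel desktop memory bandwidth

MODEL_SIZE_GB = 40   # assumed: e.g. a large model quantized to ~4-5 bits
HDD_MBPS = 100       # assumed sustained HDD read speed
RAM_GBPS = 50        # assumed memory bandwidth

hdd_seconds = MODEL_SIZE_GB * 1024 / HDD_MBPS
ram_seconds = MODEL_SIZE_GB / RAM_GBPS

print(f"One pass from HDD: {hdd_seconds:.0f} s (~{hdd_seconds / 60:.1f} min)")
print(f"One pass from RAM: {ram_seconds:.2f} s")
```

Under these assumed numbers, a single pass over the weights takes well under a second from RAM but several minutes from the platter drive, which is the "eternity" in question when every generated token needs such a pass.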
Worried that AI is going to overthrow humankind? Cripple it by making it run off a mechanical hard drive!