[–] CeeBee@lemmy.world 2 points 10 months ago (1 children)

> I don't know of an LLM that works decently on personal hardware

Try Ollama with ollama-webui. Models like solar-10.7b and mistral-7b run nicely on local hardware, and Solar 10.7B should work well on a card with 8 GB of VRAM.
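For example, once Ollama is installed and a model has been pulled (e.g. with `ollama pull mistral`), it exposes a local HTTP API on port 11434 that you can call from a few lines of Python. This is only a minimal sketch, assuming the default port and the `requests` library:

```python
# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes Ollama is installed, listening on the default port 11434,
# and the model has already been pulled (e.g. `ollama pull mistral`).
import requests

def ask_local_llm(prompt: str, model: str = "mistral") -> str:
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    response.raise_for_status()
    # With streaming disabled, the API returns a single JSON object whose
    # "response" field holds the full generated text.
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_local_llm("Explain VRAM in one sentence."))
```

Swapping `model` for `"solar"` (the Ollama name for Solar 10.7B) works the same way once that model has been pulled.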

[–] ParetoOptimalDev@lemmy.today 1 points 10 months ago

If you have really low specs, use the recently open-sourced Microsoft Phi model.
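Assuming this refers to Phi-2 (published on Hugging Face as `microsoft/phi-2`), the model is small enough (~2.7B parameters) that a rough CPU-only sketch with the `transformers` and `torch` packages can run it without a dedicated GPU:

```python
# Sketch: run Microsoft's Phi-2 on CPU via Hugging Face transformers.
# Assumes the `transformers` and `torch` packages are installed and that
# "microsoft/phi-2" is the model the comment refers to.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize a prompt and generate a short continuation on the CPU.
inputs = tokenizer("Write a haiku about small language models.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```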