this post was submitted on 31 Jan 2025
71 points (83.8% liked)

[–] AtHeartEngineer@lemmy.world 20 points 2 days ago (2 children)

If AMD were smart, they'd release an upper-mid-range card with like 40+ GB of VRAM. Doesn't even have to be their high-end card; people wanting to do local/self-hosted AI stuff would swarm on those.

[–] foggenbooty@lemmy.world 2 points 1 day ago (1 children)

Please just give us self-hosting nerds SR-IOV on affordable cards. I really want to have a Linux VM and a Windows VM that both have access to a GPU simultaneously.

I was hoping Intel would let some of these enterprise-locked features trickle down as a value add, but no dice. Every year AMD just undercuts NVIDIA by a small amount, but it doesn't compete on some of the tech NVIDIA has, so it's a wash.

But they're too concerned it would eat into their enterprise cards, where they make boatloads, so it's not going to happen. Imagine if consumer CPUs didn't support virtualization; it would be insane, and that's where we are with GPUs today.

[–] MalMen@masto.pt 1 points 1 day ago

@foggenbooty @AtHeartEngineer Spent hours and hours trying to get GPU passthrough on my VMs and only found agony and despair.
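For anyone else fighting passthrough: the usual first check is whether the GPU sits in its own IOMMU group, since a device can only be passed through cleanly alongside its own sibling functions (e.g. the card's audio controller). Here's a rough sketch in Python that walks the sysfs layout Linux exposes; it's just a diagnostic helper, not anything official, and it simply prints nothing if IOMMU isn't enabled:

```python
from pathlib import Path

def iommu_groups(sysfs_root: str = "/sys/kernel/iommu_groups") -> dict:
    """Map each IOMMU group number to the PCI addresses inside it.

    Returns an empty dict when the path is missing (IOMMU disabled
    or not a Linux system).
    """
    root = Path(sysfs_root)
    if not root.is_dir():
        return {}
    return {
        group.name: sorted(dev.name for dev in (group / "devices").iterdir())
        for group in root.iterdir()
    }

if __name__ == "__main__":
    # If your GPU shares a group with unrelated devices (USB controllers,
    # SATA, etc.), passthrough will need an ACS workaround or a different slot.
    for group, devices in sorted(iommu_groups().items(), key=lambda kv: int(kv[0])):
        print(f"group {group}: {', '.join(devices)}")
```

Cross-reference the PCI addresses with `lspci` output to find your card.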

[–] eager_eagle@lemmy.world 5 points 2 days ago* (last edited 2 days ago) (1 children)

Yeah, I've been wanting a card like that to run local models since 2020, when I got a 3080. Back then I'd have spent a bit more to get one with the same performance but some 20GB of VRAM.

Nowadays, if they released an RX 9070 with at least 24GB at a price between the 16GB model and an RTX 5080 (also 16GB), that would be neat.

[–] AtHeartEngineer@lemmy.world 3 points 2 days ago

Same, I've got a modded 2080 Ti with 22GB of VRAM running DeepSeek 32B and it's great... But it's an old card, and with it being modded, idk what the life expectancy is.
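The arithmetic behind why 22GB works for a 32B model is worth spelling out: quantized weights at roughly 4-5 bits each, plus a couple of GB for the KV cache and runtime overhead. A back-of-the-envelope sketch (the helper and the overhead figure are my own rough assumptions, not anything from a specific runtime):

```python
def model_vram_gb(n_params_billion: float, bits_per_weight: float,
                  overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: quantized weights plus KV cache/runtime overhead.

    overhead_gb is a loose assumption; real usage varies with context length.
    """
    weights_gb = n_params_billion * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + overhead_gb

# A 32B model at ~4.8 bits/weight (typical of 4-bit quantization schemes)
print(round(model_vram_gb(32, 4.8), 1))  # ~21.2 GB: just squeezes into 22GB
```

Which is also why a hypothetical 40GB card would be such a sweet spot: it clears 32B models with long context to spare, instead of running right at the edge.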