this post was submitted on 16 Sep 2024
76 points (92.2% liked)

PC Gaming

[–] henfredemars@infosec.pub 47 points 1 month ago (5 children)

That's rather depressing to hear. AI is often used as a crutch to paper over crappy code that would cost money to properly optimize. Maybe Nvidia is also using AI as a crutch instead of developing better GPUs that can actually render more pixels?

[–] catloaf@lemm.ee 10 points 1 month ago (4 children)

Usually people are against just throwing more hardware at a problem.

They're going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine. But if they can use a novel software solution to drastically increase performance, why not?
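
For a sense of scale, here's a back-of-the-envelope sketch (my own numbers, assuming the "novel software solution" means a DLSS-style upscaler): rendering internally at 1080p and upscaling to 4K shades roughly a quarter of the pixels per frame, which is where most of the headline performance gain comes from.

```python
# Back-of-the-envelope sketch (my numbers, assuming a DLSS-style upscaler):
# rendering internally at 1080p and upscaling to 4K shades about a quarter
# of the pixels per frame compared with native 4K.
native_4k = 3840 * 2160          # pixels shaded per frame at native 4K
internal_1080p = 1920 * 1080     # pixels shaded at a 1080p internal resolution

print(f"native 4K:        {native_4k:,} pixels/frame")
print(f"1080p internal:   {internal_1080p:,} pixels/frame")
print(f"shading ratio:    {native_4k / internal_1080p:.1f}x fewer pixels to render")
```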

[–] tunetardis@lemmy.ca 10 points 1 month ago (3 children)

> They’re going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine.

It's not quite as simple as that. AI needs less precision than regular graphics, so chips developed with AI in mind do not necessarily translate into higher performance for other things.

In science and engineering, people want more precision, not less. So we look for GPUs with strong 64-bit (double-precision) performance, while AI is driving the industry in the other direction, from 32-bit down to 16-bit.
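
To make the precision point concrete, here's a small Python sketch (mine, not from the thread): accumulating many small values in half precision drifts badly, while double precision stays accurate, which is roughly why scientific code wants FP64 while AI inference can often live with FP16.

```python
# Toy precision demo (not from the thread): accumulate many small values in
# half precision (FP16, common for AI inference) versus double precision
# (FP64, common in scientific computing).
import numpy as np

values = np.full(10_000, 0.001)

sum64 = values.astype(np.float64).sum()      # double precision reference

sum16 = np.float16(0.0)
for v in values.astype(np.float16):          # half precision, one add at a time
    sum16 += v                               # rounding error accumulates each step

print(f"float64 sum: {float(sum64):.4f}")    # stays essentially exact (~10.0)
print(f"float16 sum: {float(sum16):.4f}")    # drifts well below the true value
```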

[–] averyminya@beehaw.org 3 points 1 month ago

It's funny because we don't even need GPUs. There's tech that offloads the model's "search" to an analog computer that's ~98% accurate for a fraction of the energy.

I imagine NVIDIA isn't too excited about that side of AI, though.
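
As a rough illustration of that kind of claim (a toy sketch of my own, not how any real analog accelerator works): if you model the analog error as a couple percent of Gaussian noise on the matrix multiply, the top-class predictions of a simple linear classifier mostly don't change, which is the sense in which "mostly accurate for much less energy" can still be useful.

```python
# Toy sketch: treat "analog" matrix multiplication as the exact result plus a
# couple percent of Gaussian noise, then check how often the top-class
# prediction of a fake linear classifier still matches the exact computation.
# All numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

features = rng.normal(size=(1000, 64))    # 1000 fake inputs, 64 features each
weights = rng.normal(size=(64, 10))       # fake 10-class linear layer

exact = features @ weights                               # "digital" matmul
noise_scale = 0.02 * np.abs(exact).mean()                # ~2% analog error
analog = exact + rng.normal(scale=noise_scale, size=exact.shape)

agreement = np.mean(exact.argmax(axis=1) == analog.argmax(axis=1))
print(f"predictions unchanged by the noise: {agreement:.1%}")
```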
