this post was submitted on 17 Jul 2024
687 points (99.0% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 2 years ago
[–] rtxn@lemmy.world 81 points 1 year ago (7 children)

The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat's wet dream.

[–] MajorHavoc@programming.dev 51 points 1 year ago (1 children)

It will be.

IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and going to cause a whole new large batch of breaches.

The “s” in IoT stands for “security”

[–] NounsAndWords@lemmy.world 68 points 1 year ago (20 children)

I would pay for AI-enhanced hardware...but I haven't yet seen anything that AI is enhancing, just an emerging product being tacked on to everything they can for an added premium.

[–] DerisionConsulting@lemmy.ca 27 points 1 year ago

In the 2010s, it was cramming a phone app and wifi into things to try to justify the higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it's those same useless features, now with a bit of software with a flashy name that removes even more control from the user and allows the manufacturer to spy on the user even further.

[–] Fermion@feddit.nl 19 points 1 year ago

It's like rgb all over again.

At least rgb didn't make a giant stock market bubble...

[–] ryathal@sh.itjust.works 12 points 1 year ago

Anything actually AI-enhanced would be advertising the enhancement, not the AI part.

[–] lauha@lemmy.one 9 points 1 year ago

My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but once a day or so it asks for the actual passcode for added security? My A71's AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.

[–] rainynight65@feddit.de 39 points 1 year ago (1 children)

I am generally unwilling to pay extra for features I don't need and didn't ask for.

[–] UltraGiGaGigantic@lemm.ee 31 points 1 year ago (5 children)

We're not gonna make it, are we? People, I mean.

[–] crazyminner@lemmy.ml 30 points 1 year ago (1 children)

I was recently looking for a new laptop and I actively avoided laptops with AI features.

[–] lamabop@lemmings.world 18 points 1 year ago

Look, me too, but the average punter on the street just looks at the new AI features and goes, "OK, sure, give it to me." Tell them about the dodgy shit that goes with AI and you'll probably get a shrug at most.

[–] n3m37h@sh.itjust.works 24 points 1 year ago (1 children)

Let me put it in lamens terms..... FUCK AI.... Thanks, have a great day

[–] iAmTheTot@sh.itjust.works 21 points 1 year ago (1 children)

FYI the term is "layman's", as if you were using the language of a layman, or someone who is not specifically experienced in the topic.

[–] krashmo@lemmy.world 19 points 1 year ago (1 children)

Sounds like something a lameman would say

[–] cygnus@lemmy.ca 24 points 1 year ago (2 children)

The biggest surprise here is that as many as 16% are willing to pay more...

[–] kemsat@lemmy.world 23 points 1 year ago (3 children)

What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.

[–] WhyDoYouPersist@lemmy.world 27 points 1 year ago

When they start calling everything AI, soon enough it loses all meaning. They're gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that's for sure.

[–] UnderpantsWeevil@lemmy.world 21 points 1 year ago (2 children)

Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?

[–] alessandro@lemmy.ca 21 points 1 year ago

I don't think the poll question was well made... "would you like to part with your money for..." vaguely shakes hand in air "...ai?"

People were already paying for "ai" even before ChatGPT came out to popularize things: DLSS

[–] bouldering_barista@lemmy.world 21 points 1 year ago (4 children)

Who in the heck are the 16%?

[–] Honytawk@lemmy.zip 16 points 1 year ago (3 children)
  • The ones who have investments in AI

  • The ones who listen to the marketing

  • The ones who are big Weird Al fans

  • The ones who didn't understand the question

[–] Glytch@lemmy.world 10 points 1 year ago

I would pay for Weird-Al enhanced PC hardware.

[–] qaz@lemmy.world 17 points 1 year ago* (last edited 1 year ago) (1 children)

I would pay extra to be able to run open LLMs locally on Linux. I wouldn't pay for Microsoft's Copilot stuff that's shoehorned into every interface imaginable while also causing privacy and security issues. The context matters.

[–] Blue_Morpho@lemmy.world 9 points 1 year ago (4 children)

That's why NPUs are actually a good thing. The ability to run LLMs locally instead of sending everything to Microsoft/OpenAI for data mining will be great.
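A rough sketch of what "running LLMs locally" costs in practice: single-stream token generation is usually memory-bandwidth-bound, since every generated token streams the full set of weights from memory. The bandwidth figures below are illustrative assumptions for a laptop-class NPU/iGPU versus a midrange discrete GPU, not datasheet values:

```python
def decode_tokens_per_sec(model_bytes_gb, mem_bandwidth_gbps):
    """Rule-of-thumb upper bound on single-stream decode speed:
    each token requires one full pass over the model weights."""
    return mem_bandwidth_gbps / model_bytes_gb

# A 4-bit 7B model is roughly 3.5 GB of weights.
print(round(decode_tokens_per_sec(3.5, 100), 1))  # ~100 GB/s shared-memory NPU → 28.6 tok/s
print(round(decode_tokens_per_sec(3.5, 450), 1))  # ~450 GB/s discrete GPU → 128.6 tok/s
```

Either figure is plenty for a local assistant, which is why on-device inference is plausible at all; the estimate ignores prompt processing, which is compute-bound and where an NPU's TOPS actually matter.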

[–] AVincentInSpace@pawb.social 17 points 1 year ago

I'm willing to pay extra for software that isn't

[–] smokescreen@lemmy.ca 17 points 1 year ago

Pay more for a shitty ChatGPT clone in your operating system that can get exploited to hack your device. I see no flaw in this at all.

[–] capital@lemmy.world 14 points 1 year ago (1 children)

My old ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?

[–] UltraMagnus0001@lemmy.world 12 points 1 year ago (1 children)

Fuck, they won't upgrade to TPM for Windows 11

[–] chicken@lemmy.dbzer0.com 11 points 1 year ago

I can't tell how good any of this stuff is because none of the language they're using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, and how does it compare to the graphics cards people use for AI now?
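The comparison the marketing never gives is simple arithmetic: weight memory scales with parameter count times bits per weight. A back-of-envelope sketch (weight storage only; KV cache and runtime overhead add more on top):

```python
def weight_gb(params_billion, bits_per_weight):
    """Approximate weight-only memory footprint of a model, in GB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 7B-parameter model:
print(weight_gb(7, 16))  # fp16 → 14.0 GB, needs a high-end GPU
print(weight_gb(7, 4))   # 4-bit quantized → 3.5 GB, fits a 6 GB card
```

By this measure, whatever fits in the few GB of memory an NPU shares with the system is limited to small, heavily quantized models, while a 24 GB graphics card can hold a 4-bit model in the 30B range.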

[–] ZILtoid1991@lemmy.world 9 points 1 year ago

A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others are either designed to handle only machine-learning number formats rather than normal floating point, or are CPU extensions that are just even bigger vector multipliers for select datatypes (AMX).
