this post was submitted on 07 May 2025
724 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
[–] dreugeworst@lemmy.ml 24 points 2 days ago* (last edited 2 days ago) (2 children)

afaict they're computers with a GPU that has some hardware dedicated to the kind of matrix multiplication common in inference in current neural networks. pure marketing BS, because most GPUs come with that these days, and some will still not be powerful enough to be useful
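(for the curious: a minimal sketch in plain Python, no real framework, of the operation that dedicated hardware accelerates: the matrix-vector multiply at the heart of a neural network's linear layer during inference. the function name and toy values are made up for illustration.)

```python
def linear_forward(W, x, b):
    """One dense-layer forward pass: y = W @ x + b.

    This repeated multiply-accumulate is exactly what NPUs /
    tensor cores are built to do fast, usually at low precision.
    """
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

# Toy 2x3 weight matrix, 3-element input, 2-element bias.
W = [[1.0, 0.0, 2.0],
     [0.5, 1.0, 0.0]]
x = [3.0, 4.0, 5.0]
b = [0.1, 0.2]

print(linear_forward(W, x, b))  # roughly [13.1, 5.7]
```

a real model just does this billions of times per token, which is why "has matmul hardware" alone says nothing about whether the chip is fast enough to be useful.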

[–] blarth@thelemmy.club 1 points 1 day ago

This comment is the most important one in this thread. Laptops already had GPUs. Does the copilot button actually result in you conversing with an LLM locally, or is inference done in the cloud? If the latter, it’s even more useless.