TheGrandNagus

joined 1 year ago
[–] TheGrandNagus@lemmy.world 1 points 4 hours ago (1 children)

That's not what was said. What was said was that anybody using it in any capacity for any job should be fired.

Which is obviously a very, very stupid take.

[–] TheGrandNagus@lemmy.world 1 points 4 hours ago

Literally none of that goes against what I've said.

[–] TheGrandNagus@lemmy.world 8 points 10 hours ago* (last edited 10 hours ago) (1 children)

100 experiences that define 25-35-year-old, social media-posting, middle-class Britons

[–] TheGrandNagus@lemmy.world 3 points 11 hours ago

Doesn't look bad.

[–] TheGrandNagus@lemmy.world 2 points 15 hours ago* (last edited 15 hours ago)

Nobody is talking about making vaping illegal entirely.

There are already usually restrictions on drinking alcohol while walking down the street.

Sure, there are some places (e.g. the UK) where it isn't restricted, but even there, local authorities have the power to forbid it if they want to.

Currently smoking doesn't have this, and neither does vaping.

[–] TheGrandNagus@lemmy.world -1 points 15 hours ago (2 children)

Not really true.

And locally-run translation that utilises AI, as well as AI accessibility features for blind users, isn't nefarious.

People need to actually look into features before they have a stupid and completely reactionary "it says AI therefore evil" response.

[–] TheGrandNagus@lemmy.world 2 points 16 hours ago

It's literally no different here.

[–] TheGrandNagus@lemmy.world 6 points 16 hours ago* (last edited 4 hours ago) (5 children)

Don't you be bringing nuance into this.

If you used an LLM to find that mistyped variable name, you deserve to lose your job. You and your family must suffer.

If you are blind and you use a screen reader with some AI features, you should be fired and that tech needs to be taken from you. You must suffer.

Honestly we should just kill them, even.

[–] TheGrandNagus@lemmy.world 4 points 16 hours ago

You're right. Within 10 seconds I found an article from 2006 saying exactly that. Earlier ones likely exist.

[–] TheGrandNagus@lemmy.world 7 points 17 hours ago* (last edited 16 hours ago) (4 children)

Indeed. GPs have been doing this for a long time. It's nothing new, and expecting every GP to know every single ailment that humanity has ever experienced, to recall it quickly, and immediately know the course of action to take, is unreasonable. They are only human.

Like you say, if they're blindly following a generic ChatGPT instance trained on whatever crap it's scraped from the internet, then that's bad.

If they're aiding their search with an LLM that has been trained on a good medical dataset, then taking that result and looking into it further, there's no issue.

People have become so reactionary to LLMs and other AI stuff. It seems there's an "omg it's so cool, everybody should use it to the max, let's blindly trust it!" camp and an "it's awful and shouldn't exist, burn it all! No algorithms or machine learning anywhere. New tech is bad!" camp.

Both camps are equally stupid. There's zero nuance in the discussion about this stuff, and it's tiring.

[–] TheGrandNagus@lemmy.world 2 points 18 hours ago

And they're the only people who can easily do it.

Anybody else needs a new motherboard and RAM. Those people are thinking "hmmm, I can spend $700+ upgrading to Zen5, or I could spend $180 on a 5700X3D, not have to pull my entire PC apart, and get about the same real-world performance because I'll be GPU-bottlenecked anyway."

[–] TheGrandNagus@lemmy.world 2 points 18 hours ago

The PS5's and the Steam Deck's CPU architectures are both Zen2, so the 5950X is the most modern (Zen3).


This appears to be a move to counter the UMPK glide bombs Russia has recently started using to great effect against Ukraine.

Russia can launch these from Russian soil, safe from Ukrainian fire. These missiles will allow Ukraine to strike grounded planes and weapons stockpiles in Russia.

It's an interesting move, considering the US has been telling Ukraine not to use any western long-range weapons against Russia directly.
