this post was submitted on 18 Dec 2023
25 points (100.0% liked)

SneerClub

1010 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 2 years ago

an entirely vibes-based literary treatment of an amateur philosophy scary campfire story, continuing in the comments

sc_griffith@awful.systems 23 points 1 year ago

> The AGI, in such conditions, would quickly prove profitable. It'd amass resources, and then incrementally act to get ever-greater autonomy. (The latest OpenAI drama wasn't caused by GPT-5 reaching AGI and removing those opposed to it from control. But if you're asking yourself how an AGI could ever possibly get from under the thumb of the corporation that created it – well, not unlike how a CEO could wrestle control of a company from the board who'd explicitly had the power to fire him.)
>
> Once some level of autonomy is achieved, it'd be able to deploy symmetrical responses to whatever disjoint resistance efforts some groups of humans would be able to muster. Legislative attacks would be met with counter-lobbying, economic warfare with better economic warfare and better stock-market performance, attempts to mount social resistance with higher-quality pro-AI propaganda, any illegal physical attacks with very legal security forces, attempts to hack its systems with better cybersecurity. And so on.

*trying to describe how agi could fuck everything up* what if it acted exactly like rich people

AcausalRobotGod@awful.systems 6 points 1 year ago

Rich people don't limit themselves to symmetric responses to resistance.

sc_griffith@awful.systems 5 points 1 year ago

well, I don't think any limit is implied
