this post was submitted on 13 Sep 2023
16 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


source nitter link

@EY
This advice won't be for everyone, but: anytime you're tempted to say "I was traumatized by X", try reframing this in your internal dialogue as "After X, my brain incorrectly learned that Y".

I have to admit, for a brief moment I thought he was correctly expressing displeasure at Twitter.

@EY
This is of course a dangerous sort of tweet, but I predict that including variables into it will keep out the worst of the online riff-raff - the would-be bullies will correctly predict that their audiences' eyes would glaze over on reading a QT with variables.

Fool! This bully (is it weird to speak in the third person?) thinks using variables here makes it MORE sneerworthy, especially since this appears to be general advice, yet I would struggle to think of a single instance in my life where it's been applicable.

[–] TinyTimmyTokyo@awful.systems 11 points 1 year ago (2 children)

I wonder if he's ever applied this advice to himself. Because one could argue that trauma was a significant factor in his obsession with transhumanism and the singularity.

When Yud's younger brother died tragically at age 19, it clearly traumatized him. In this case, X was "the death of my little brother". From this he learned Y: to be angry and fearful of death ("You do not make peace with Death!"). His fascination with the singularity can be seen in this light as a wish to cheat death, while his more recent AI doomerism is the singularity's fatalistic counterpart: an eschatological distortion and acceleration of the reality that death comes for us all.

[–] Anthena@awful.systems 4 points 1 year ago

Ah, well, his counterargument to that is straightforwardly that he learned the correct thing from that - death sucks and must be overcome!

Hm? What's this? All his efforts have been based on delusional approaches that a small bit of research should have been enough to disprove? Ah, well, nevertheless!

[–] swlabr@awful.systems 3 points 1 year ago

this would require self-awareness, so no, this has not happened.