this post was submitted on 29 Aug 2023
13 points (100.0% liked)

SneerClub

982 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
all 30 comments
[–] TerribleMachines@awful.systems 11 points 1 year ago (1 children)

My worry in 2021 was simply that the TESCREAL bundle of ideologies itself contains all the ingredients needed to “justify,” in the eyes of true believers, extreme measures to “protect” and “preserve” what Bostrom’s colleague, Toby Ord, describes as our “vast and glorious” future among the heavens.

Golly gee, those sure are all the ingredients for white supremacy these folks are playing around with. Good job there are no signs of racism... right, right?!?!

In other news, I find it wild that big Yud has gone on an arc from "I will build an AI to save everyone" to "let's do a domestic terrorism against AI researchers." He should be careful, someone might think this is displaced rage at his own failure to make any kind of intellectual progress while academic AI researchers have passed him by.

(Idk if anyone remembers how salty he was when AlphaGo showed up and crapped all over his "symbolic AI is the only way" mantra, but it's pretty funny to me that the very group of people he used to say were incompetent are a "threat" to him now they're successful. Schoolyard bully stuff and wotnot.)

[–] BrickedKeyboard@awful.systems 0 points 1 year ago* (last edited 1 year ago) (1 children)

academic AI researchers have passed him by.

Just to be pedantic, it wasn't academic AI researchers. The current era of AI began here: https://www.npr.org/2012/06/26/155792609/a-massive-google-network-learns-to-identify

Since 2012, academic AI researchers haven't had the compute hardware to contribute to frontier AI research, except some who worked at corporate giants (mostly DeepMind) and went back into academia.

They are getting more hardware now, but the hardware required to be relevant and to develop a capability that commercial models don't already have keeps increasing. Table stakes are now something like 10,000 H100s, or about $250-500 million in hardware.

https://www.semianalysis.com/p/google-gemini-eats-the-world-gemini
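The "table stakes" figure above checks out as a back-of-envelope calculation. A quick sketch, assuming (my assumption, not stated in the comment) a per-H100 price of roughly $25,000-$50,000 depending on configuration and volume pricing:

```python
# Back-of-envelope check of the "10,000 H100s" table-stakes figure.
# Per-GPU price range is an assumption for illustration.
num_gpus = 10_000
price_low, price_high = 25_000, 50_000  # USD per H100, assumed

cost_low = num_gpus * price_low
cost_high = num_gpus * price_high

print(f"${cost_low / 1e6:.0f}M - ${cost_high / 1e6:.0f}M")
# prints "$250M - $500M"
```

Which lands right on the $250-500 million range quoted, before even counting networking, datacenter buildout, and power.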

I am not sure MIRI tried any meaningful computational experiments. They came up with unrunnable algorithms that theoretically might work but would need nearly infinite compute.

As you were being pedantic, allow me to be pedantic in return.

Admittedly, you might know something I don't, but I would describe Andrew Ng as an academic. These kinds of industry partnerships, like the one in that article you referred to, are really, really common in academia. In fact, it's how a lot of our research gets done. We can't do research if we don't have funding, and so a big part of being an academic is persuading companies to work with you.

Sometimes companies really, really want to work with you, and sometimes you've got to provide them with a decent value proposition. This isn't just AI research either, but very common in statistics, as well as biological sciences, physics, chemistry, well, you get the idea. Not quite the same situation in humanities, but eh, I'm in STEM.

Now, in terms of universities having the hardware: certainly these days there is no way a university will have even close to the compute power that a large company like Google has access to. Though even back in 2012 (and well before), universities had supercomputers. It was pretty common to have a resident supercomputer that you'd use. For me (my background's originally in physics), back then we had a supercomputer in our department, the only one at the university, and people from other departments would occasionally ask to run stuff on it. A simpler time.

It's less that universities don't have access to that compute power; it's more that they just don't run server farms. So we pay for it from Google or Amazon and so on, like everyone in the corporate world (except, of course, the companies that run those servers, which still have to cover costs and lost revenue). Sometimes that's subsidized by working with a big tech company, but it isn't always.

I'm not even going to get into the history of AI/ML algorithms and the role of academic contributions there, and I don't claim that the industry played no role; but the narrative that all these advancements are corporate just ain't true, compute power or no. We just don't shout so loud or build as many "products."

Yeah, you're absolutely right that MIRI didn't try any meaningful computational experiments that I've seen. As far as I can tell, their research record is... well, staring at ceilings and thinking up vacuous problems. I actually once (when I flirted with the cult) went to a seminar that the big Yud himself delivered, and he spent the whole time talking about qualia; then, when someone asked him if he could describe a research project he was actively working on, he refused to, on the basis that it was "too important to share."

"Too important to share"! I've honestly never met an academic who doesn't want to talk about their work. Big Yud is a big let down.

[–] maol@awful.systems 5 points 1 year ago (1 children)

The description of how utopians see critics ("profoundly immoral people who block the path to utopia, threatening to impede the march toward paradise, arguably the greatest moral crime one could commit") is extremely similar to the way scientologists see their critics and ex-members. I suppose at least TESCREALists have a slightly higher measure of independence than scientologists and are thus less likely to be convinced to poison a critic's dog or send them threatening letters.

[–] saucerwizard@awful.systems 6 points 1 year ago* (last edited 1 year ago) (1 children)

Not just Scientology; Peoples Temple, Aum, or Shining Path all apply too.

Edit: they also have more money and political influence at this point than the above. This isn't good imo.

[–] NataliePortland@lemmy.ca 4 points 1 year ago (4 children)

This is a good article. I just recently heard this TESCREAL term from the Crazy Town podcast, and it's scary stuff. I don't know what some of these words are on your community sidebar and I'm not at all sure what this community is about. I'm guessing it's a hate group, otherwise you would use words everyone understands. And now I'm wondering what's even the use of this comment. And I'm probably about to get attacked by members of an online hate group

[–] dgerard@awful.systems 7 points 1 year ago* (last edited 1 year ago) (1 children)

yeah, sneerclub is a sub for those who know too much about these bozos (I've been following them since 2010 good lord) to post negativity. The linked article is the easy introduction to their foolishness.

TESCREAL is becoming the accepted academic acronym - Transhumanism, Singularity, Cosmism, Rationalism, Effective Altruism, Longtermism - even though TREACLES was right there

[–] froztbyte@awful.systems 7 points 1 year ago (3 children)

even though TREACLES was right there

I still wish there were a U in there somewhere, because ARSECULT

[–] bitofhope@awful.systems 7 points 1 year ago (2 children)

Too bad E and A should be next to each other too

[–] froztbyte@awful.systems 8 points 1 year ago

Hmm branding opportunity, “unwanted altruism”

[–] skillissuer@discuss.tchncs.de 3 points 1 year ago (2 children)
[–] bitofhope@awful.systems 1 points 1 year ago

Maybe this place needs a sister community SNEARCULT.

[–] froztbyte@awful.systems 0 points 1 year ago (1 children)

“SEARCULT is totally innocent, okay? We’re just really into good steak and Enlightened Conversations.”

[–] skillissuer@discuss.tchncs.de 2 points 1 year ago

i just wanna grill for god's sake

[–] elmtonic@lemmy.world 5 points 1 year ago

Brilliant. Can we just shoehorn in Utilitarianism too then?

[–] GorillasAreForEating@awful.systems 3 points 11 months ago

Utilitarianism

Having lurked for a long time, sneerclub is aimed at people who already have a good idea of the horror of TESCREAL groups—the point isn't to attract new members, but catharsis for those of us who have had to deal with the TechBros/Fascists etc.

and for sneering, the sneering is important.

Getting real for a moment: for me, I used to be in deep with these people, and then my friends in the community committed suicide due to the rampant sexual abuse, and I got the hell out. Sneer club was the only place the reports of assault were taken seriously, while the TESCREALs all closed ranks.

It's all a ways back for me now, but I love this place. That there is a tiny part of the Internet out there that calls these people on their shit and sneers gives me so much peace.

(For sneerclubbers reading this; thanks folks, you're the best! ✨️)

[–] blakestacey@awful.systems 6 points 1 year ago

We're a point-and-laugh-at-TESCREAL-people group.

[–] saucerwizard@awful.systems 3 points 1 year ago (1 children)
[–] dgerard@awful.systems 7 points 1 year ago (1 children)

not an unreasonable fear when the linked article details the cultist death threats Torres has been getting

[–] NataliePortland@lemmy.ca 10 points 1 year ago* (last edited 1 year ago) (3 children)

ya idk maybe i was a bit hasty in tossing around 'hate group' but I would still like to just Homer Simpson right back into the bush.

get a load of these DWEEBS tryna eugenics their way into utopia! (am i doing this right?)

[–] maol@awful.systems 8 points 1 year ago

I realise that a Reddit-clone forum with a load of jargon probably does look a bit, um... hate group-y, but I wouldn't describe sneerclub as a hate group. More like Scientology watchers. The reason sneer club is no longer on Reddit is that it chose to leave (during the recent changes), not that it was banned (like, say, r/incel).

[–] dgerard@awful.systems 7 points 1 year ago

they are cuddly little smol bean nerds, very endearing!!! except you know the pervasive race science

[–] froztbyte@awful.systems 6 points 1 year ago

ya idk maybe i was a bit hasty in tossing around ‘hate group’ but I would still like to just Homer Simpson right back into the bush.

speaking for myself (and likely others would agree): I understand this impulse, but I choose to throw shade because these utterly moronic ideas deserve public shaming and pushback. the fact that it's so easy to take their shit apart while they ostensibly have all those highly-paid bigbrains is, well, rather indicative of just how thoroughly put together their fuckwittery is

get a load of these DWEEBS tryna eugenics their way into utopia! (am i doing this right?)

yep

[–] TerryTPlatypus@beehaw.org 4 points 1 year ago

These people are too disconnected from reality, too proud of their "intelligence" and "success" from their tech startups, and have read too much Isaac Asimov. smh.