istewart

joined 8 months ago
[–] istewart@awful.systems 6 points 1 day ago

ChatGPT's got what intelligence craves... it's got neurons

[–] istewart@awful.systems 9 points 3 days ago

As I noted on the YouTube video, this is doubly heinous as a lot of CA community college instructors are "freeway flyers" - working at multiple campuses, sometimes almost 100 miles apart, just to cobble together a full-time work schedule for themselves. Online, self-paced, forum-based class formats were already becoming popular even before the pandemic, and I've been in such classes where the professor indicated that I was one of maybe 3 or 4 students who bothered to show up to in-person office hours. I have to wonder if that will end up being a hard requirement at some point. The bottom rung on the higher-education ladder is already the most vulnerable, and this just makes it worse.

[–] istewart@awful.systems 6 points 1 week ago (1 children)

I have to agree. There are already at least two notable and high-profile failure stories with consequences that are going to stick around for years.

  1. The Israeli military's use of "AI" targeting systems as an accountability sink in service of a predetermined policy of ethnic cleansing.
  2. The DOGE creeps wanting to rewrite bedrock federal payment systems with AI assistance.

And sadly more to come. The first story is likely to continue to get a hands-off treatment in most US media for a few more years yet, but the second one is almost certainly going to generate Tacoma Narrows Bridge-level legends of failure and necessary restructuring once professionals are back in command. The kind of thing that is put into college engineering textbooks as a dire warning of what not to do.

Of course, it's up to us to keep these failures in the public spotlight and framed appropriately. The appropriate question is not, "how did the AI fail?" The appropriate question is, "how did someone abusively misapply stochastic algorithms?"

[–] istewart@awful.systems 11 points 2 weeks ago

Would you invest in commercial real estate, knowing there was a non-zero chance your tenants might come in one day to discover a thoroughly intoxicated JD Vance in a compromising position with the break-room furniture?

[–] istewart@awful.systems 9 points 2 weeks ago (1 children)

Mesa-optimization... that must be when you rail some crushed-up Adderall XRs, boof some modafinil for good measure, and spend the night making sure your kitchen table surface is perfectly flat with no defects abrasions deviations contusions...

[–] istewart@awful.systems 15 points 2 weeks ago* (last edited 2 weeks ago)

Couldn't help myself; there are seldom more perfect opportunities to use this one

[–] istewart@awful.systems 18 points 2 weeks ago

I actually think it's part-and-parcel of Yarvin's personality. As much as he rails against "the Cathedral," PMCs, whatever, he himself is a perfect example of a pathological middle manager. Somebody who wants power without having to shoulder ultimate responsibility. He craves the childishly simplified social environment of a medieval-fantasy king's court, but he doesn't want to be the king himself. He wants to be (and has been, up until now) the scheming vizier who can run his manipulation games in the background, deciding who gets in front of the king but not having to take the heat if the king makes a bad decision. (And the "kings" he works for have made plenty of bad decisions, but consequences have only just begun to catch up.)

I suspect this newfound mainstream attention is far more uncomfortable than it is validating for him. Perhaps the NYT profile was a burst of exhilaration, but the shine has worn off quickly. This correlates with the story last year about him coming back to Urbit as a "wartime CEO." If Urbit is so damn important for building his ridiculous vision, why wasn't he running it the whole time? He doesn't actually want to be CEO of anything. Power without responsibility.

[–] istewart@awful.systems 12 points 2 weeks ago (4 children)

He will never stop to reflect that his "philosophy," such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo

[–] istewart@awful.systems 13 points 2 weeks ago

Obvious joke is obvious, but

The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned “into dust. Into quarks” with the coherence of a meth-addled squirrel.

Harvard isn't already full of Quarks?

[–] istewart@awful.systems 11 points 2 weeks ago (1 children)

Another thread worth pulling is that biotechnology and synthetic biology have turned out to be substantially harder to master than anticipated, and it didn't seem like it was ever the primary area of expertise for a lot of these people anyway. I don't have a copy of any of Kurzweil's books at hand to look at his predicted timelines for that stuff, but they're surely way off.

Faulty assumptions about the biological equivalence of digital neural network algorithms have done a lot of unexamined heavy lifting in driving the current AI bubble, and keeping the harder stuff on the fringes of the conversation. That said, I don't doubt that a few refugees from the bubble-burst will attempt to inflate the next bubble on the back of speculative biotech, and I've seen a couple of signs of that already.

[–] istewart@awful.systems 5 points 2 weeks ago (2 children)

For my money, 2015/16 Adams trying to sell Trump as a "master persuader" while desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he's ever been. Once he switched from skimmable text blogging to livestreaming, though, he demanded too much of my time to stay interesting.

[–] istewart@awful.systems 7 points 2 weeks ago (2 children)

"This Is What Yudkowsky Actually Believes" seems like a subtitle that would get heavy use in a future episode of South Park about Cartman dropping out after one semester at community college.
