fnix

joined 1 year ago
[–] fnix@awful.systems 5 points 1 month ago

Testing for genetic defects is very different from the Gattaca-premise of most everything about a person being genetically deterministic, with society ordered around that notion. My point was that such a setting is likely inherently impossible, since “heritability” doesn’t work like that; the most techbros can do is LARP at it, which, granted, can be very dangerous on its own – the fact that race is a social construct doesn’t preclude racism and so on. But there’s no need to get frightened by science fiction when science facts tell a different story.

[–] fnix@awful.systems 2 points 1 month ago (2 children)

Well, in the same way that Mars colonies are here now. Techbros with more money than sense throwing it at things with futuristic aesthetics doesn’t make them real.

[–] fnix@awful.systems 4 points 1 month ago

Aren’t you supposed to try to hide your psychopathic instincts? I wonder if he’s knowingly bullshitting or if he’s truly gotten high on his own supply.

[–] fnix@awful.systems 2 points 3 months ago* (last edited 3 months ago)

If I wanted to sound like a rationalist I'd tell Scott to check his fallacies, specifically category error. It's just such basic, wilful misconstrual on his part. Yeah, me liking my spaghetti quite salty doesn't mean I want to add salt to the dessert!

That's all beside the original point, which is that a rigged system is one where the best do not rise to the top. So even if our socioeconomic system and... Starcraft streamers (lol) were categorically comparable, which it shouldn't have to be said they in no way are, the OG point is precisely that so much talent goes underutilized and glory unrealized due to a lack of broad cultivation and opportunity.

I don't get what makes people this way, with such small souls, just painstakingly intent on being miserly. Same thing with JK Rowling, she has all the money in the world to have the wildest pleasures or to leave everything and go off to some yurt for a spiritual search and instead she just purposefully acts in the most destructive and self-constricting manner. And this applies more generally to the awash-in-cash techbro and rationalist sets as well. You have the resources to do really interesting things, and yet you dedicate your time to making Juiceros.

[–] fnix@awful.systems 10 points 3 months ago (1 children)

Amazing quote he included from Tyler Cowen:

If you are ever tempted to cancel somebody, ask yourself “do I cancel those who favor tougher price controls on pharma? After all, they may be inducing millions of premature deaths.” If you don’t cancel those people — and you shouldn’t — that should broaden your circle of tolerance more generally.

Yes, leftists, you not cancelling someone campaigning for lower drug prices is actually the same as endorsing mass murder, and hence you should think twice before cancelling sex predators. It's in fact called ephebophilia.

What the globe emoji followed with is also a classic example of rationalists getting mesmerized by their verbiage:

What I like about this framing is how it aims to recalibrate our sense of repugnance in light of “scope insensitivity,” a deeply rooted cognitive bias that occurs “when the valuation of a problem is not valued with a multiplicative relationship to its size.”


Thank you sir, I didn’t know the way to fix ailing welfare states was to make ChatGPT available to all.

It is truly the ultimate technofix.

[–] fnix@awful.systems 5 points 7 months ago* (last edited 7 months ago)

That is high praise indeed, but I believe the good mayor has yet to make clear to everyone that, as an acausal manifestation of the godhead, self-driving cars serve to remind us to spend at least an hour a day in silent contemplation over how to bring ASI into existence, lest one should incur the Serpent's eternal wrath in the Simulation.

[–] fnix@awful.systems 5 points 7 months ago
  1. THE BASILISK WILL COME FOR YOU ALL
[–] fnix@awful.systems 18 points 8 months ago* (last edited 8 months ago) (4 children)

Where did you get that impression from? He says himself he is not advocating against aid per se, but that its effects should be judged more holistically, e.g. that organizations like GiveWell should also include the potential harms alongside benefits in their reports. The overarching message seems to be one of intellectual humility – to not lose sight that the ultimate aim is to help another human being who in the end is a person with agency just like you, not to feel good about yourself or to alleviate your own feelings of guilt.

The basic conceit of projects like EA is the incredible high of self-importance and moral superiority one can get blinded by when one views themselves as more important than other people by virtue of helping so many of them. No one likes to be condescended to; sure, a life saved with whatever technical fix is better than a life lost, but human life is about so much more than bare material existence – dignity and freedom are crucial to a good life. The ultimate aim should be to shift agency and power into the hands of the powerless, not to bask in being the white knight trotting around the globe, saving the benighted from themselves.

[–] fnix@awful.systems 6 points 8 months ago

This is a long but great read that gets to the very human follies behind the hyper-rational exterior of EA. Highly recommended!

[–] fnix@awful.systems 6 points 11 months ago

Not Just zhe Autobahn, but zhe Highest Altruismus: Zhe Effective Altruist Case für Replacing Degenerate Stock vith Herrenvolk

[–] fnix@awful.systems 6 points 1 year ago (7 children)

the only future in that direction is one where they’re doing a much more painful version of the same job (programming against cookie cutter LLM code) for much, much less pay.

To the extent that LLMs actually make programming more “productive”, isn’t the situation analogous to the way the power loom was bad for skilled handweavers whilst making textiles more affordable for everyone else?

I should perhaps say that I'm saying this as someone who is just starting out as a web developer (really chose the right time for that, hah). I try to avoid LLMs and even strictly unnecessary libraries for now because I like learning about how everything works under the hood and want to get an intimate grasp of what I'm doing, but I can also see that ultimately that's not what people pay you for, and that once you've built up sufficient skill to quickly parse LLM output, the demands of the market may make using them unavoidable.

To be honest, I feel as conflicted & anxious about it all as others already mentioned. Maybe I am just too green to fully understand the value that I would eventually bring, but can I really, in good conscience, say that a customer should pay me more when someone else can provide a similar product that’s “good enough” at a much lower price?

Sorry for being another bummer. :(

[–] fnix@awful.systems 4 points 1 year ago

I got introduced to the genre through Star Trek and I always found its moral vision, in addition to all the weekly alien weirdness & how it was approached with patient curiosity, strongly appealing. Roddenberry set out to create an explicit alternative to the impoverished perspectives of the Cold War era. The Prime Directive is non-interventionist to a fault.
