swlabr

joined 2 years ago
[–] swlabr@awful.systems 12 points 5 months ago (4 children)

It probably deserves its own post on techtakes, but let’s do a little here.

People are tool-builders with an inherent drive to understand and create

Diogenes’s corpse turns

which leads to the world getting better for all of us.

Of course Saltman means “all of my buddies,” since he doesn’t consider 99% of the human population to be human.

Each new generation builds upon the discoveries of the generations before to create even more capable tools—electricity, the transistor, the computer, the internet, and soon AGI.

Ugh. Among the many things wrong here: people didn’t jerk each other off to scifi/spec-fic fantasies about any of those other inventions.

In some sense, AGI is just another tool in this ever-taller scaffolding of human progress we are building together. In another sense, it is the beginning of something for which it’s hard not to say “this time it’s different”; the economic growth in front of us looks astonishing, and we can now imagine a world where we cure all diseases, have much more time to enjoy with our families, and can fully realize our creative potential.

AGI IS NOT EVEN FUCKING REAL YOU SHIT. YOU CAN’T CURE FUCK WITH DREAMS

We continue to see rapid progress with AI development.

I must be blind.

  1. The intelligence of an AI model roughly equals the log of the resources used to train and run it. These resources are chiefly training compute, data, and inference compute. It appears that you can spend arbitrary amounts of money and get continuous and predictable gains; the scaling laws that predict this are accurate over many orders of magnitude.

“Intelligence” has in no way been quantified here, so this is a meaningless observation. “Data” is finite, which negates the idea of “continuous” gains. “Predictable” is a meaningless qualifier. This makes no fucking sense!
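To spell out why the log claim eats itself: take it at face value and invert it. If “intelligence” equals the log of resources, then each additional point of intelligence costs 10x the resources of the last one. A quick back-of-the-envelope sketch (every number here is illustrative, derived only from the quote’s own framing, nothing measured):

```python
# Altman's claim, taken literally: "intelligence" I = log10(resources R).
# Inverting it: each additional point of I multiplies required resources by 10.
# (Purely illustrative units; nothing here is real data.)
def resources_needed(intelligence_points: int) -> int:
    return 10 ** intelligence_points

for i in range(1, 5):
    print(f"I = {i}  ->  R = {resources_needed(i):,}x baseline")
```

Linear gains in “intelligence” demand exponential growth in resources, which finite data and finite money cannot sustain, so “continuous and predictable gains” is the claim refuting itself.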

  2. The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger.

“Moore’s law” didn’t change shit! It was a fucking observation! Anyone who misuses “Moore’s law” oughta be mangione’d. Also, if this is true, just show a graph or something? Don’t just cherry-pick one window.
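For the record, the arithmetic in that comparison is trivial to check yourself. This sketch just annualizes the two claimed rates (the quote’s own numbers, not verified data) to show what’s actually being compared:

```python
# Annualize a "factor every N months" rate so the two claims are comparable.
# Numbers come straight from the quoted post, not from any measurement.
def annual_factor(factor: float, months: int) -> float:
    return factor ** (12 / months)

ai_cost_claim = annual_factor(10, 12)  # claimed: 10x cost drop per 12 months
moore_claim = annual_factor(2, 18)     # Moore's observation: 2x per 18 months

print(f"claimed AI cost decline: {ai_cost_claim:.2f}x per year")
print(f"Moore's observation:     {moore_claim:.2f}x per year")
```

The point stands either way: a single GPT-4 to GPT-4o price window is one data point, and annualizing one cherry-picked window into a “law” is exactly the move being complained about.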

  3. The socioeconomic value of linearly increasing intelligence is super-exponential in nature. A consequence of this is that we see no reason for exponentially increasing investment to stop in the near future.

“Linearly increasing intelligence” is meaningless, as intelligence has not been… wait, I’m repeating myself. Also, it’s “super-exponential” only for the “socio” that Ol’ Salty cares about, as I mentioned earlier.

If these three observations continue to hold true, the impacts on society will be significant.

Oh hm but none of them are true. What now???

Stopping here for now, I can only take so much garbage in at once.

[–] swlabr@awful.systems 15 points 5 months ago

The condescension and patronisation are well deserved. Your question is answered in the fucking title of the paper.

The Impact of Generative AI on Critical Thinking

If you’d ever engaged in critical thinking, then maybe we could have avoided this exercise.

[–] swlabr@awful.systems 10 points 5 months ago (2 children)

Fagin, of course, the co-creator of Steely Dan… right?

[–] swlabr@awful.systems 5 points 5 months ago

I have some songs for your truck obsessed child (possibly NSFW)

[–] swlabr@awful.systems 13 points 5 months ago

If scientific racism is an attempt to legitimise racism, then this is scientific great replacement theory… which is just racism anyway.

[–] swlabr@awful.systems 9 points 5 months ago

Not sure if this is entirely ignorable as a tactic, or if the counter-tactic is to post similar stickers but with references/QR codes to classic shock sites.

[–] swlabr@awful.systems 11 points 5 months ago (1 children)

“The real evil would be to let a MORE evil company design, build, and profit from this”

-argument I’ve heard from many googlers on why google specifically should be the one to create the orphan crushing machine

[–] swlabr@awful.systems 6 points 5 months ago

wheatley is overselling tbh

[–] swlabr@awful.systems 12 points 5 months ago (3 children)
[–] swlabr@awful.systems 15 points 5 months ago

“Oh well, time to learn absolutely nothing from this and continue to be terrible people,” said Grimes and Aella mentally, and unbeknownst to them, because they each have one brain cell quantum-entangled* with the other’s, simultaneously

*I finished the three body problem trilogy recently! Where do I sign up for the ETO

[–] swlabr@awful.systems 23 points 5 months ago (3 children)

LLM experts aka poop sommeliers

[–] swlabr@awful.systems 6 points 5 months ago

The ideal wronger future after all is a simulation where human experience is deleted altogether!
