Okay, two separate thoughts here:
- Paul G is so fucking close to getting it, Christ on a bike
- How the fuck do you get burned by someone as soulless as Sam Altman
Musk says: “At times, I think Grok-3 is kind of scary smart.” Grok is just remixing its training data — but a stochastic parrot is still more reality-based than Elon Musk. [Bloomberg, archive]
If someone roasted me with such surgical precision, I'd delete my entire Internet presence out of shame. God damn.
I'm mentally filing this next to that clip of a black guy telling someone to just call him the N-word.
That opens you up to accusations of click fraud, as AdNauseam found out the hard way, but it's worth it if you can squeeze some cash out of them before that happens.
Sentiment analysis surrounding AI suggests sneers are gonna moon pretty soon. Good news for us, since we've been stacking sneers for a while.
Recently stumbled upon an anti-AI mutual aid/activism group that's being set up, I suspect some of you will be interested.
xAI has applied for permits for the first set of turbines. But it won’t install pollution controls unless and until its permits are approved. At that point, xAI will be “the lowest-emitting facility in the country,” allegedly.
Musk probably sees gassing black people as a free bonus for installing the turbines, I strongly doubt he's installing pollution controls.
As a famous swindler once said, there's a sucker born every minute.
New article from Jared White: Sorry, You Don’t Get to Die on That “Vibe Coding” Hill, aimed at sneering the shit out of one of Simon Willison's latest blog posts. Here's a personal highlight of mine:
Generative AI is tied at the hip to fascism (do the research if you don’t believe me), and it pains me to see pointless arguments over what constitutes “vibe coding” overshadow the reality that all genAI usage is anti-craft and anti-humanist and in fact represents an extreme position.
Baldur Bjarnason's given his thoughts on Bluesky:
My current theory is that the main difference between open source and closed source when it comes to the adoption of “AI” tools is that open source projects generally have to ship working code, whereas closed source only needs to ship code that runs.
I’ve heard so many examples of closed source projects that get shipped but don’t actually work for the business. And too many examples of broken closed source projects that are replacing legacy code that was both working just fine and genuinely secure. Pure novelty-seeking.
Personal rule of thumb: all autoplag is serious until proven satire.
It was me, I stole all the sexy robots