rook

joined 1 year ago
[–] rook@awful.systems 7 points 5 days ago (6 children)

Valsorda was on Mastodon for a bit (in ‘22 maybe?) and was quite keen on it, but left after a bunch of people got really pissy at him over one of his projects. I can’t actually recall what it even was, but his argument was that people posted stuff publicly on Mastodon, so he should be able to do what he liked with those posts even if they asked him not to. I can see why he might not have a problem with LLMs.

Anyone remember what he was actually doing? Text search or network tracing or something else?

[–] rook@awful.systems 16 points 1 week ago

One to keep an eye on… you might all know this already, but apparently Mozilla has an “add ai chatbot to sidebar” option in Firefox Labs (https://blog.nightly.mozilla.org/2024/06/24/experimenting-with-ai-services-in-nightly/, available in at least v130). You can currently choose from a selection of public llm providers, similar to the search provider choice.

Clearly, Mozilla has its share of AI boosters, given that they forced “ai help” onto MDN against a significant amount of protest (see https://github.com/mdn/yari/issues/9230 from last July for example), so I expect this stuff to proceed apace.

This is fine, because Mozilla clearly has time and money to spare and nothing else useful they could be doing, alternative browsers are readily available, and there has never been any anti-ai backlash to adding this sort of stuff to any other project.

[–] rook@awful.systems 7 points 1 week ago (1 children)

Looking at both cohost and tumblr, I don’t think the funder has an asset that’s worth very much.

[–] rook@awful.systems 10 points 1 week ago (9 children)

Cohost is going read-only at the end of this month and shutting down at the end of the year: https://cohost.org/staff/post/7611443-cohost-to-shut-down

Their radical idea of building a social network that did not require either VC funding or large amounts of volunteer labour has come to a disappointing, if not entirely surprising, end. Going in without a great idea of how to monetise the thing was probably not the best strategy, as it turns out.

[–] rook@awful.systems 7 points 1 week ago

One or more of the following:

  • they don’t bother with ai at all, but pretending they do helps with sales and marketing to the gullible
  • they have ai but it is totally shit, and they have to mechanical turk everything to have a functioning system at all
  • they have shit ai, but they’re trying to make it better and the humans are there to generate test and training data annotations

[–] rook@awful.systems 12 points 2 weeks ago (4 children)

Sounds like he’s been huffing too much of whatever the neoreactionaries offgas. Seems to be the inevitable end result of a certain kind of techbro refusing to learn from history, and imagining themselves to be some sort of future grand vizier in the new regime…

[–] rook@awful.systems 17 points 2 weeks ago (2 children)

Interview with the president of the Signal Foundation: https://www.wired.com/story/meredith-whittaker-signal/

There’s a bunch of interesting stuff in there: the observation that LLMs and the broader “ai” “industry” were made possible thanks to surveillance capitalism, but also the link between advertising and the algorithmic determination of human targets for military action, which seems obvious in retrospect but which I hadn’t spotted before.

But in 2017, I found out about the DOD contract to build AI-based drone targeting and surveillance for the US military, in the context of a war that had pioneered the signature strike.

What’s a signature strike?

A signature strike is effectively ad targeting but for death. So I don’t actually know who you are as a human being. All I know is that there’s a data profile that has been identified by my system that matches whatever the example data profile we could sort and compile, that we assume to be Taliban related or it’s terrorist related.

[–] rook@awful.systems 8 points 2 weeks ago

You should try what these folk are selling.

https://h2o4u.ca/

[–] rook@awful.systems 7 points 2 weeks ago

WONTFIX: system working as designed.

[–] rook@awful.systems 5 points 2 weeks ago

To my limited knowledge, no, for various values of “someone”. It is just a sort of malign beige juggernaut that’s shitty all by itself without needing external direction.

[–] rook@awful.systems 8 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

I have faith in the ability of the UK public sector (or rather, the relentlessly incompetent outsourcers they hire) to catastrophically fuck up delivery of any software project.

For example, Capita has already lined up at the trough: https://www.capita.co.uk/news/capita-advances-approach-next-generation-ai-microsoft

If you’re unfamiliar with Capita, that’s probably a good thing. I’m not aware that they’ve ever been successful in anything, other than their continued ability to fleece the government. They’re basically too big to fail in the UK, because HMG’s procurement processes mean it can’t stop giving them money.

[–] rook@awful.systems 5 points 1 month ago

Sounds like the Exordium device from the Revelation Space books.
