this post was submitted on 20 May 2025
Technology
In this video about Lavender AI, which is Israel's Palantir, they discuss the accuracy score the AI assigns to targets — how confident it is that a given person is Hamas. That score is used to determine how expensive a weapon may be spent taking the person out, and how many innocent bystanders they are willing to kill along with that target.
https://youtu.be/4RmNJH4UN3s
Thanks, nice video, and he seems to have some numbers. Very inhumane that they worked out exact figures, like an allowance to take out 15 or 20 bystanders as well, or an entire elementary school if it's a high-ranking "target". I mean, war is a different thing than policing, but a minimum of 10% false positives plus collateral murder is quite a lot. And then I'm not sure there is any substance to those numbers. I suppose they conveniently eliminate all the evidence with the same bomb that kills the people. And they don't do follow-up research, so I wonder how they even figured out a ratio.