this post was submitted on 20 May 2025
192 points (99.5% liked)

Technology


In 2012, Palantir quietly embedded itself into the daily operations of the New Orleans Police Department. There were no public announcements. No contracts were made available to the city council. Instead, the surveillance company partnered with a local nonprofit to sidestep oversight, gaining access to years of arrest records, licenses, addresses, and phone numbers, all to build a shadowy predictive policing program.

Palantir’s software mapped webs of human relationships, assigned residents algorithmic “risk scores,” and helped police generate “target lists,” all without public knowledge. “We very much like to not be publicly known,” a Palantir engineer wrote in an internal email later obtained by The Verge.

After years spent quietly powering surveillance systems for police departments and federal agencies, the company has rebranded itself as a frontier AI firm, selling machine learning platforms designed for military dominance and geopolitical control.

“AI is not a toy. It is a weapon,” said CEO Alex Karp. “It will be used to kill people.”

all 22 comments
[–] hansolo@lemm.ee 4 points 2 hours ago

A book about this was published last year: "Your Face Belongs to Us" by Kashmir Hill.

[–] DandomRude@lemmy.world 3 points 3 hours ago (1 children)

I wonder what George Orwell would have to say about all this... nvm, he already said everything there is to say.

[–] Bloomcole@lemmy.world -2 points 2 hours ago* (last edited 2 hours ago) (2 children)

He would love it since he was a right-wing POS, snitch and thief among other things.

[–] DandomRude@lemmy.world 1 points 2 hours ago (1 children)

Not sure if we are talking about the same George Orwell here or how you came to the conclusion that he was a "right-wing POS" - he was definitely not.

[–] jaybone@lemmy.zip 2 points 1 hour ago

Yeah, that comment is a weird take. I think he was in the British army and stationed in India; he wrote a bit about that, both fiction and nonfiction. It’s been a long time since I’ve read any of it, but I recall it was nuanced and contextual.

Maybe that commenter can cite some specific examples to back that up.

[–] Mrkawfee@lemmy.world 0 points 2 hours ago (1 children)

Yep. I read some of his nonfiction and he was clearly a little-Englander xenophobe. I have no doubt he would have supported Brexit and been hostile to immigrants if he were around today.

[–] RedditIsDeddit@lemmy.world 2 points 1 hour ago

You really shouldn't project that shit onto other people without knowing for sure.

[–] Fingolfinz@lemmy.world 20 points 6 hours ago (1 children)

That company and the engineers are soulless monsters

[–] interdimensionalmeme@lemmy.ml 6 points 5 hours ago (1 children)

Frankly a lot of engineers are soulless and their only north star is profit and lawsuits.

[–] AnalogNotDigital@lemmy.wtf 2 points 5 hours ago

Yeah seriously most engineers I've ever met have a chronic case of sociopathy.

[–] Bloomcole@lemmy.world 1 points 2 hours ago

The CIA company? No way!

[–] SpaceNoodle@lemmy.world 12 points 6 hours ago

Title gore. Use some damn commas! (Not your fault, OP.)

[–] hendrik@palaver.p3x.de 8 points 6 hours ago* (last edited 6 hours ago) (3 children)

Would be nice to get some numbers on the accuracy and performance of such a dystopian sci-fi technology (Minority Report?). There should be some, since it's already been in operation for 13 years...

[–] CoolThingAboutMe@aussie.zone 7 points 5 hours ago (1 children)

In this video about Lavender AI, which is Israel's Palantir, they talk about the accuracy score the AI gives targets for how sure it is that they are Hamas. That score is used to determine how expensive the weapons used to take that person out can be, and how many innocent bystanders they are willing to take out along with that target.

https://youtu.be/4RmNJH4UN3s

[–] hendrik@palaver.p3x.de 1 points 3 hours ago* (last edited 3 hours ago)

Thanks, nice video, and it seems he has some numbers. Very inhuman that they worked out exact figures, like an allowance to take out 15-20 bystanders as well, or an entire elementary school if it's a high-ranking "target". I mean, war is a bit of a different thing than policing. But a minimum of 10% false positives plus collateral murder is quite a lot. And then I'm not sure if there is any substance to those numbers. I suppose they conveniently eliminate all the evidence with the same bomb that kills the people. And they don't do research, so I wonder how they even figured out a ratio.

[–] ScoffingLizard@lemmy.dbzer0.com 12 points 6 hours ago (1 children)

They accurately recognized my face from a picture that was over ten years old. I no longer look like I did in the picture. This is going to get bad now that our fascist state authoritarians are everywhere jerking off with their riches and superiority. They have this tech and will do terrible things with it. Impending doom...

[–] hendrik@palaver.p3x.de 6 points 6 hours ago* (last edited 6 hours ago)

I'm not very surprised. I think even old-school face recognition does things like measure the distances between your eyes, nose, etc., and stuff like that (your skull) doesn't change a lot during 10 years of adulthood. The real danger is that they connect all of that information. And as you said, it's everywhere these days; they have a lot of sources and -of course- it's in the wrong hands. I say "of course" because I don't think there are many applications where it helps with anything. That technology is mainly good for oppression. And predictive policing and social scores are content for Black Mirror episodes or old sci-fi movies. Not reality.
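For what it's worth, the old-school geometric approach described above can be sketched in a few lines. The landmark names and coordinates below are invented purely for illustration (real systems use dozens of detected landmarks, not four hand-typed points), but the idea is the same: ratios of distances between fixed skull features barely change over a decade, so they make a crude, age-resistant signature.

```python
import math

# Hypothetical 2D facial landmarks in pixel coordinates.
# The point names and values are made up for illustration.
LANDMARKS_A = {"left_eye": (30, 40), "right_eye": (70, 40),
               "nose_tip": (50, 60), "chin": (50, 95)}
LANDMARKS_B = {"left_eye": (33, 44), "right_eye": (72, 43),
               "nose_tip": (52, 64), "chin": (51, 99)}

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def signature(lm):
    """All pairwise landmark distances, normalized by the inter-eye
    distance so the signature is invariant to image scale."""
    scale = dist(lm["left_eye"], lm["right_eye"])
    keys = sorted(lm)
    return [dist(lm[a], lm[b]) / scale
            for i, a in enumerate(keys) for b in keys[i + 1:]]

def similarity(lm1, lm2):
    """Mean absolute difference between signatures (0 = identical geometry)."""
    s1, s2 = signature(lm1), signature(lm2)
    return sum(abs(x - y) for x, y in zip(s1, s2)) / len(s1)

score = similarity(LANDMARKS_A, LANDMARKS_B)
print(f"geometry difference: {score:.3f}")  # small value -> similar skull geometry
```

Because the signature is built from scale-normalized ratios, two photos of the same face taken at different resolutions, or years apart, still score close to each other, which is roughly why a decade-old photo can still match.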

[–] AcidicBasicGlitch@lemm.ee 4 points 5 hours ago (1 children)

Yeah there's already at least one well known case. This article mentions it https://wp.api.aclu.org/press-releases/208236

The use of facial recognition technology by Project NOLA and New Orleans police raises serious concerns regarding misidentifications and the targeting of marginalized communities. Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation; the Project NOLA real-time surveillance system supercharges the risks.

“We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies. These individuals could be added to Project NOLA's watchlist without the public’s knowledge, and with no accountability or transparency on the part of the police departments

Police use to justify stops and arrests: Alerts are sent directly to a phone app used by officers, enabling immediate stops and detentions based on unverified purported facial recognition matches.

[–] hendrik@palaver.p3x.de 1 points 4 hours ago* (last edited 4 hours ago)

I'm looking more for large-scale quantitative numbers. I mean one destroyed life is really bad. But they could argue they'd have saved 30 lives in turn, and then we'd need to discuss how to do the maths on that...

[–] Ledericas@lemm.ee 2 points 5 hours ago* (last edited 4 hours ago)

It always was spying software. Thiel has always built it as spying tech.

[–] AWittyUsername@lemmy.world 4 points 6 hours ago

Truly some Robocop dystopian shit.