this post was submitted on 20 May 2025
315 points (99.7% liked)

Technology


In 2012, Palantir quietly embedded itself into the daily operations of the New Orleans Police Department. There were no public announcements. No contracts made available to the city council. Instead, the surveillance company partnered with a local nonprofit to sidestep oversight, gaining access to years of arrest records, licenses, addresses, and phone numbers, all to build a shadowy predictive policing program.

Palantir’s software mapped webs of human relationships, assigned residents algorithmic “risk scores,” and helped police generate “target lists,” all without public knowledge. “We very much like to not be publicly known,” a Palantir engineer wrote in an internal email later obtained by The Verge.

After years spent quietly powering surveillance systems for police departments and federal agencies, the company has rebranded itself as a frontier AI firm, selling machine learning platforms designed for military dominance and geopolitical control.

"AI is not a toy. It is a weapon,” said CEO Alex Karp. “It will be used to kill people.”

top 50 comments
[–] Gammelfisch@lemmy.world 4 points 5 hours ago (1 children)

The battlefields in Ukraine are Palantir's live-fire laboratories. I wonder if they told Krasnov to STFU about denying or restricting military aid to Ukraine.

[–] DandomRude@lemmy.world 12 points 10 hours ago (20 children)

I wonder what George Orwell would have to say about all this... nvm, he already said everything there is to say.

load more comments (20 replies)
[–] Fingolfinz@lemmy.world 36 points 13 hours ago (2 children)

That company and the engineers are soulless monsters

[–] Bo7a@lemmy.ca 7 points 6 hours ago* (last edited 6 hours ago) (2 children)

Devil's Advocate (damn near literally this time around)

Try being a young engineer at the top of your game and saying no to an offer where the yearly salary makes Google engineers jealous. Not everyone can say no.

Palantir offers something like $400k/year to run-of-the-mill forward-deployed engineers for Foundry (the civilian platform), where the job is 99% actually helping customers with interesting engineering problems.

I can't even imagine what they are offering folks working on Gotham (the govt/military side).

I guess what I'm trying to say is that while a ton of those engineers are soulless sociopaths, some of them just took a job that pays super well and don't personally align with the goals of the C-levels. And in fact, a ton won't even know what those goals are.

Remember: our enemy is the C-suite, not the level-1 support agent. Even at evilcorp. Thankfully I am in a position where my kids are grown up and the money treadmill isn't set on hard mode for me anymore. I can say no. But even for me it is sometimes difficult.

[–] Fingolfinz@lemmy.world 3 points 2 hours ago (1 children)

Yeah, well, I don't really give a fuck. I never became a class traitor for money, and I don't forgive anyone who has.

[–] Bo7a@lemmy.ca 1 points 42 minutes ago (1 children)

I haven't either. But I can see how a young person without a lot of knowledge of the world and the impending weight of 50 years of work ahead of them, possibly with a family to feed, or an extended family to take care of due to the inherently predatory healthcare system where they were born, might make that choice. And I understand it, regardless of "forgiveness".

[–] Fingolfinz@lemmy.world 2 points 36 minutes ago (2 children)

We are failing the youth and need to completely change our society. I get what you're saying and see why it happens, but it's the result of our fucked-up society and capitalism. People coming out of school shouldn't have to face these ethical dilemmas because of the system they're subjected to, and if everyone doesn't get on board, then we're just fucked, and every generation after will be even more fucked.

[–] Bo7a@lemmy.ca 1 points 5 minutes ago

I would bet we have a lot of the exact same thoughts on why this happens, and probably on how to solve it. My only disagreement, and it is not a strong one, is with the impossibility of forgiveness.

If they mature and leave, or even better, commit to being a monkey wrench for a bit before leaving... I think I can find space for them in my community.

[–] Eyedust@lemmy.dbzer0.com 2 points 9 minutes ago

Yeah, but what's more feasible? Uniting a complacent society while not knowing when or where your next meal will come from, or taking a fat check home and living comfortably? Especially when kids come into the mix. Why do you think they want to push the "have kids" and anti-abortion agenda? Because once children are involved, you're only going to think about what's best for your family, and what's best for children is stability and peace.

I don't disagree with you, but I don't condemn the little people trying to survive, either.

[–] a4ng3l@lemmy.world 2 points 4 hours ago* (last edited 2 hours ago) (1 children)

How do you go from "saying no to cash" to "c-levels are the issue" in the context of ethical considerations for engineers that enable AI in military industrial complex?

The proverbial prospective engineer effectively decides that the lives he will impact matter less than his salary. That's ethics and morality... and a seasoned AI engineer can certainly eat well enough in any other industry.

[–] Bo7a@lemmy.ca 1 points 3 hours ago* (last edited 1 hour ago) (1 children)

How do you go from "saying no to cash" to "c-levels are the issue" in the context of ethical considerations for engineers that enable AI in military industrial complex?

I am not sure I get what this word soup is saying. No offense intended, but maybe try re-wording it if you want to discuss.

PS: Foundry is not an AI platform. The engineers I am talking about are usually 20-ish-year-old Java and Python devs, and it is easier to understand how someone in that group might not even know how evil evilcorp is.

[–] a4ng3l@lemmy.world 0 points 3 hours ago

Try to read it slowly maybe?

[–] interdimensionalmeme@lemmy.ml 11 points 13 hours ago (1 children)

Frankly, a lot of engineers are soulless; their only north stars are profit and avoiding lawsuits.

[–] AnalogNotDigital@lemmy.wtf 5 points 12 hours ago

Yeah, seriously, most engineers I've ever met have a chronic case of sociopathy.

[–] hansolo@lemm.ee 7 points 9 hours ago

A book about this was published last year: "Your Face Belongs to Us" by Kashmir Hill.

[–] SpaceNoodle@lemmy.world 12 points 14 hours ago

Title gore. Use some damn commas! (Not your fault, OP.)

[–] hendrik@palaver.p3x.de 9 points 13 hours ago* (last edited 13 hours ago) (3 children)

Would be nice to get some numbers on the accuracy and performance of such a dystopian sci-fi technology (Minority Report?). There should be some, since it's already been in operation for 13 years...

[–] ScoffingLizard@lemmy.dbzer0.com 14 points 13 hours ago (1 children)

They accurately recognized my face from a picture that was over ten years old. I no longer look like I did in the picture. This is going to get bad now that our fascist state authoritarians are everywhere jerking off with their riches and superiority. They have this tech and will do terrible things with it. Impending doom...

[–] hendrik@palaver.p3x.de 7 points 13 hours ago* (last edited 13 hours ago)

I'm not very surprised. Even old-school face recognition does things like measure the distances between your eyes, nose, etc., and stuff like that (your skull) doesn't change much over 10 years of adulthood. The real danger is that they connect all of that information. And as you said, it's everywhere these days; they have a lot of sources, and, of course, it's in the wrong hands. I say "of course" because I don't think there are many applications where this helps anyone. That technology is mainly good for oppression. Predictive policing and social scores should be material for Black Mirror episodes and old sci-fi movies, not reality.
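To make the skull-geometry point concrete, here's a toy sketch with completely made-up landmark coordinates. Real systems extract landmarks from an image (e.g. with OpenCV or dlib), and modern ones compare learned embeddings rather than hand-picked ratios:

```python
# Toy illustration of "old-school" face matching via landmark-distance ratios.
# All coordinates below are invented for the example.
import math

def ratio_vector(landmarks):
    """Scale-invariant feature vector: every pairwise landmark distance,
    divided by the inter-eye distance so image resolution drops out."""
    names = sorted(landmarks)
    eye_dist = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    return [
        math.dist(landmarks[a], landmarks[b]) / eye_dist
        for i, a in enumerate(names)
        for b in names[i + 1:]
    ]

def dissimilarity(face_a, face_b):
    """Euclidean distance between ratio vectors; smaller means more alike."""
    return math.dist(ratio_vector(face_a), ratio_vector(face_b))

# The "same skull" photographed ten years apart, at twice the resolution.
old_photo = {"left_eye": (100, 120), "right_eye": (160, 120),
             "nose_tip": (130, 160), "chin": (130, 220)}
new_photo = {"left_eye": (200, 240), "right_eye": (320, 240),
             "nose_tip": (260, 320), "chin": (260, 440)}

print(dissimilarity(old_photo, new_photo))  # ~0.0: a match despite the 2x scale
```

Because the ratios are scale-invariant and bone structure barely moves, a ten-year-old photo can still score as a match.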

[–] CoolThingAboutMe@aussie.zone 8 points 12 hours ago (1 children)

In this video about Lavender AI, which is Israel's Palantir, they talk about the accuracy score the AI gives targets, i.e., how sure it is that they are Hamas. That score is used to determine how expensive the weapons used to take that person out can be, and how many innocent bystanders they are willing to kill along with the target.

https://youtu.be/4RmNJH4UN3s

[–] hendrik@palaver.p3x.de 2 points 11 hours ago* (last edited 10 hours ago)

Thanks, nice video, and it seems he has some numbers. Very inhuman that they worked out exact allowances, like being permitted to take out 15 or 20 bystanders along with the target, or an entire elementary school if it's a high-ranking "target". I mean, war is a different thing than policing, but a minimum of 10% false positives plus collateral murder is quite a lot. And then, I'm not sure there is any substance to those numbers. I suppose they conveniently eliminate all the evidence with the same bomb that kills the people, and they don't do follow-up research, so I wonder how they even arrived at a ratio.
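Even taking the video's numbers at face value, the back-of-the-envelope math is grim. A sketch, assuming (purely for illustration) that the bystander allowance were used in full on every strike, which it surely isn't:

```python
# Rough arithmetic on the figures cited above; the inputs are assumptions
# for illustration, not documented statistics.
strikes = 100
false_positive_rate = 0.10  # "a minimum 10% false positives"
bystanders_allowed = 15     # lower end of the 15/20 allowance

total_killed = strikes * (1 + bystanders_allowed)                         # 1600
innocents = strikes * false_positive_rate + strikes * bystanders_allowed  # 1510
print(f"innocents: {innocents / total_killed:.0%} of all deaths")         # ~94%
```

Even if the allowance is rarely maxed out, the false-positive rate would be the smaller part of the body count.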

[–] AcidicBasicGlitch@lemm.ee 5 points 12 hours ago (1 children)

Yeah, there's already at least one well-known case. This article mentions it: https://wp.api.aclu.org/press-releases/208236

The use of facial recognition technology by Project NOLA and New Orleans police raises serious concerns regarding misidentifications and the targeting of marginalized communities. Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation; the Project NOLA real-time surveillance system supercharges the risks.

“We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies. These individuals could be added to Project NOLA's watchlist without the public’s knowledge, and with no accountability or transparency on the part of the police departments.”

Police use to justify stops and arrests: Alerts are sent directly to a phone app used by officers, enabling immediate stops and detentions based on unverified purported facial recognition matches.

[–] hendrik@palaver.p3x.de 1 points 11 hours ago* (last edited 11 hours ago) (1 children)

I'm looking more for large-scale quantitative numbers. I mean one destroyed life is really bad. But they could argue they'd have saved 30 lives in turn, and then we'd need to discuss how to do the maths on that...

[–] AcidicBasicGlitch@lemm.ee 2 points 6 hours ago (1 children)

Yeah, I am not sure but hopefully somebody has the numbers. That is usually the argument for trampling the constitution.

Hard to say how much has actually been documented, because they've been doing a lot of this stuff off the record.

They potentially saved 30,000 lives by locking up 100 people for crimes committed by somebody else in states they've never been to, and we might not even know about it.

[–] hendrik@palaver.p3x.de 1 points 5 hours ago* (last edited 5 hours ago)

Oh well, some people in the USA have a really "interesting" relationship with their own constitution these days... I sometimes feel like explaining it to them. Or what a constitutional republic is.

Yeah, keeping things "off record" is the usual strategy to get away with whatever you wanted.

[–] AWittyUsername@lemmy.world 6 points 13 hours ago

Truly some Robocop dystopian shit.

[–] Bloomcole@lemmy.world 1 points 10 hours ago (1 children)
[–] AcidicBasicGlitch@lemm.ee 0 points 6 hours ago

Fuckin A Man, CIA man

Every time I hear that, I picture some guy stopping somebody else mid-sentence when it comes on, just to be like "Wait, wait... you hear that? That's me!" and then blasting it at full volume while he sings along. I feel like that has to have happened at least once or twice.

TBF it is a very catchy song

[–] Ledericas@lemm.ee 2 points 12 hours ago* (last edited 11 hours ago)

They were always spying software. Thiel has always built it as spying tech.
