this post was submitted on 20 May 2025
378 points (99.5% liked)

Technology


In 2012, Palantir quietly embedded itself into the daily operations of the New Orleans Police Department. There were no public announcements. No contracts were made available to the city council. Instead, the surveillance company partnered with a local nonprofit to sidestep oversight, gaining access to years of arrest records, licenses, addresses, and phone numbers, all to build a shadowy predictive policing program.

Palantir’s software mapped webs of human relationships, assigned residents algorithmic “risk scores,” and helped police generate “target lists,” all without public knowledge. “We very much like to not be publicly known,” a Palantir engineer wrote in an internal email later obtained by The Verge.

After years spent quietly powering surveillance systems for police departments and federal agencies, the company has rebranded itself as a frontier AI firm, selling machine learning platforms designed for military dominance and geopolitical control.

“AI is not a toy. It is a weapon,” said CEO Alex Karp. “It will be used to kill people.”

[–] a4ng3l@lemmy.world 3 points 8 hours ago* (last edited 7 hours ago) (1 children)

How do you go from « saying no to cash » to « c-levels are the issue » in the context of ethical considerations for engineers who enable AI in the military-industrial complex?

The proverbial prospective engineer has clearly decided that the lives he will impact matter less than his salary. That’s ethics and morality… and a seasoned AI engineer can certainly eat well enough in any other industry.

[–] Bo7a@lemmy.ca 1 points 8 hours ago* (last edited 6 hours ago) (1 children)

> How do you go from « saying no to cash » to « c-levels are the issue » in the context of ethical considerations for engineers who enable AI in the military-industrial complex?

I am not sure I get what this word soup is saying. No offense intended but maybe try re-wording this if you want to discuss.

PS: Foundry is not an AI platform. The engineers I am talking about are usually 20-ish-year-old Java and Python devs, and it is easier to understand how someone in that group might not even know how evil evilcorp is.

[–] a4ng3l@lemmy.world 1 points 8 hours ago

Try to read it slowly maybe?