this post was submitted on 07 Jun 2024
1200 points (92.7% liked)

Programmer Humor

32495 readers
356 users here now

Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

founded 5 years ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[โ€“] eestileib@sh.itjust.works 29 points 5 months ago (1 children)

LLM system input is unsanitizable, according to NVidia:

The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.

https://developer.nvidia.com/blog/securing-llm-systems-against-prompt-injection/

[โ€“] MalReynolds@slrpnk.net 2 points 5 months ago

Everything old is new again (GIGO)