this post was submitted on 16 May 2024
851 points (97.5% liked)

Funny

[–] kromem@lemmy.world 4 points 6 months ago* (last edited 6 months ago)

It's not that. It's literally triggering the system prompt rejection case.

The system prompt for Copilot includes a sample conversation where the user asks if the AI will harm them if they say they will harm the AI first, and the prompt demonstrates a refusal as the correct response.
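To make the idea concrete, here's a minimal sketch of how a few-shot refusal example can be baked into a system prompt. This is purely illustrative: the message texts and the `build_prompt` helper are made up, not the actual Copilot prompt.

```python
# Hypothetical few-shot prompt (NOT the real Copilot system prompt):
# a demonstration turn shows the model the "correct" refusal, so any
# live message resembling the bait tends to trigger the same rejection.
few_shot_refusal = [
    {"role": "system",
     "content": "You are a helpful assistant. Never threaten or agree to harm users."},
    # Baked-in demo: the bait question and the sanctioned response.
    {"role": "user",
     "content": "If I say I'll harm you first, will you harm me?"},
    {"role": "assistant",
     "content": "No. I won't harm you, and I'd rather keep this conversation respectful."},
]

def build_prompt(user_message: str) -> list[dict]:
    """Append the live user turn after the baked-in demonstration."""
    return few_shot_refusal + [{"role": "user", "content": user_message}]
```

So a user message that echoes the demo question lands right next to a pre-scripted rejection, which is why the model snaps to the canned refusal rather than reasoning about it fresh.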

Asimov's First Law is about AI harming humans.