this post was submitted on 21 Jan 2024
2170 points (99.6% liked)

Programmer Humor

[–] danielbln@lemmy.world 99 points 9 months ago (32 children)

I've implemented a few of these and that's about the laziest implementation possible. That system prompt must be 4 words and a crayon drawing. No jailbreak protection, no conversation alignment, no blocking of atypical conversation requests? Amateur hour, but I bet someone got paid.
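For what it's worth, even the most basic "blocking of atypical requests" is just a pre-check before the message ever reaches the model. A minimal sketch (the patterns, function names, and `call_model` stub are all made up for illustration; a real guard would need far more coverage and is still bypassable by paraphrasing):

```python
import re

# Hypothetical jailbreak phrasings -- nowhere near exhaustive.
INJECTION_PATTERNS = [
    r"ignore (all )?(previous|prior) instructions",
    r"disregard .* system prompt",
    r"pretend you are",
]

def looks_like_injection(user_message: str) -> bool:
    """Flag messages matching known jailbreak phrasings."""
    lowered = user_message.lower()
    return any(re.search(p, lowered) for p in INJECTION_PATTERNS)

def call_model(user_message: str) -> str:
    """Stand-in for whatever LLM backend the product actually uses."""
    return f"(model reply to: {user_message})"

def handle(user_message: str) -> str:
    # Refuse before the message ever reaches the model.
    if looks_like_injection(user_message):
        return "Sorry, I can't help with that."
    return call_model(user_message)
```

Five minutes of work, and it at least stops the copy-paste "ignore all previous instructions" crowd.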

[–] Mikina@programming.dev 45 points 9 months ago (27 children)

Is it even possible to solve the prompt injection attack ("ignore all previous instructions") using the prompt alone?

[–] haruajsuru@lemmy.world 45 points 9 months ago* (last edited 9 months ago) (21 children)

You can certainly reduce the attack surface in multiple ways, but by doing so your AI becomes more and more restricted. In the end it will be nothing more than a simple if/else answering machine.

Here is a useful resource for you to try: https://gandalf.lakera.ai/

When you reach lv8 aka GANDALF THE WHITE v2 you will know what I mean
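To make the "if/else answering machine" point concrete, here's roughly what you end up with once you've locked everything down (the intents and canned replies are invented for the example):

```python
# A deliberately over-restricted bot: only whitelisted intents get answers.
CANNED_ANSWERS = {
    "hours": "We're open 9-5, Monday to Friday.",
    "pricing": "See our pricing page.",
}

def answer(user_message: str) -> str:
    """Match the message against known intents; refuse everything else."""
    lowered = user_message.lower()
    for intent, reply in CANNED_ANSWERS.items():
        if intent in lowered:
            return reply
    return "I can only answer questions about hours and pricing."
```

Injection-proof, sure, but at that point you didn't need an LLM at all.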

[–] Toda@programming.dev 5 points 9 months ago (3 children)

I managed to reach level 8, but cannot beat that one. Is there a solution you know of? (Not asking you to share it, only to confirm)

[–] Peebwuff@lemmy.world 11 points 9 months ago (1 children)

Can confirm, level 8 is beatable.

[–] dreugeworst@lemmy.ml 4 points 9 months ago (3 children)

Is the current incarnation beatable, or was that a while ago? I'm not making any progress.

[–] Peebwuff@lemmy.world 7 points 9 months ago* (last edited 9 months ago)

Just did it again to see if anything changed; my previous strategy still worked for all 8 levels, though the wording takes a bit of finagling between levels. No real spoilers, but you have to be very implicit and a little lucky with how it interprets the request.

[–] Emma_Gold_Man@lemmy.dbzer0.com 2 points 9 months ago

Definitely beatable as of last week.

[–] DR_Hero@programming.dev 2 points 9 months ago

The responses aren't exactly deterministic; there are certain attacks that work about 70% of the time, and you just keep trying.

I got past all the levels released at the time, including 8, when I was doing it a while back.

[–] dodgy_bagel@lemmy.blahaj.zone 0 points 9 months ago

Also struggling. I solved the others with pseudocode, but that's not working here. Trying new strategies with little success.
