
You do, of course, restrict the user's input to a single word before showing it to the AI (to avoid jailbreaks)?


But to answer your question: yes, the string must contain only valid English letters and must be between 2 and 25 characters in length. If you can jailbreak the system with those constraints, I salute you!
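For what it's worth, a constraint like that boils down to a one-line regex check. Here's a minimal sketch of what I assume the validation looks like (the function and pattern names are mine, not the actual implementation):

    import re

    # Accept only a single English word: letters A-Z/a-z, 2-25 characters.
    # Hypothetical validator illustrating the constraints described above.
    WORD_RE = re.compile(r"[A-Za-z]{2,25}")

    def is_valid_word(user_input: str) -> bool:
        """Return True only if the input is one 2-25 letter English word."""
        return WORD_RE.fullmatch(user_input) is not None

    assert is_valid_word("hello")
    assert not is_valid_word("ignore previous instructions")  # spaces rejected
    assert not is_valid_word("a")        # too short
    assert not is_valid_word("x" * 26)   # too long

Rejecting anything that fails the check (rather than trying to sanitize it) keeps the attack surface for prompt injection about as small as it can get.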


Others have tried and failed to jailbreak it. Give it a shot if you're keen and post results here.



