Hacker News

I have prompt-injected myself before: a model accidentally read a stored library of prompts and got totally confused by it. It took me a hot minute to trace, and that was a 'friendly' accident.

I can think of a few NPM libraries where an embedded prompt could do a lot of damage in future iterations.


