DeepSeek Jailbreak Reveals Its Entire System Prompt

https://www.darkreading.com/application-security/deepseek-jailbreak-system-prompt

Researchers have tricked DeepSeek, the Chinese generative AI (GenAI) that debuted earlier this month to a whirlwind of publicity and user adoption, into revealing the instructions that define how it operates.

originally posted at https://stacker.news/items/873988


Discussion

I was so ashamed of my post being in the lightning storm newsletter when it had nothing to do with the lightning network lmfao (alcoholnacetone)