Replying to ch0k1

DeepSeek Jailbreak Reveals Its Entire System Prompt

https://www.darkreading.com/application-security/deepseek-jailbreak-system-prompt

Researchers have tricked DeepSeek, the Chinese generative AI (GenAI) that debuted earlier this month to a whirlwind of publicity and user adoption, into revealing the instructions that define how it operates.

originally posted at https://stacker.news/items/873988

I was so ashamed that my post ended up in the Lightning Storm newsletter when it had nothing to do with the Lightning Network lmfao (alcoholnacetone)
