LLM Attacks Take Just 42 Seconds On Average, 20% of Jailbreaks Succeed - spatwei shared an article from SC World:

Attacks on large language models (LLMs) ... - https://it.slashdot.org/story/24/10/12/213247/llm-attacks-take-just-42-seconds-on-average-20-of-jailbreaks-succeed?utm_source=rss1.0mainlinkanon&utm_medium=feed #ai
