Yes, but I’m also not convinced it’s completely solvable. The nature of how LLMs work makes them potentially impossible to “secure” against many kinds of attacks.
It’s like trying to tell a many-faced die which values it should show.