Replying to Juspace

I just tried the new GPT-4-based Bing chatbot. It is so badly screwed up that it is hard to understand.

I asked about US bank reserve requirements and it confidently answered: 10%, or 3% if certain conditions apply.

I pointed out that the requirement was actually dropped to 0% in 2020 when COVID hit. Bing confirmed this, but did not want to acknowledge that it had provided false info at first.

Then I asked if the 0% reserve requirement made bank runs more feasible. It brought up SVB.

This was how it explained the SVB default: (Note! Totally false info here.)

Also, the "references" it cited (1. CNBC, 2. LA Times, 3. Washington Post) did not contain anything that actually backed this story.

Deliberately misleading or just a really bad AI?

Zach⚡️ 2y ago

I do think this is concerning, just because some people worship what these AIs say and aren't able to think for themselves.
