Replying to Juspace

I just tried the new Bing GPT-4-based chatbot. It is so badly screwed up that it is hard to believe.

I asked about US bank reserve requirements, and it confidently answered: 10%, or 3% if certain conditions apply.

I pointed out that the requirement was actually dropped to 0% in 2020 when COVID hit. Bing confirmed this, but did not want to acknowledge that it had provided false info at first.

Then I asked if the 0% reserve requirement made bank runs more feasible. It brought up SVB.

This was how it explained the SVB default: (Note! Totally false info here.)

Also, the "references" (1) CNBC, (2) LA Times, (3) Washington Post did not contain anything that actually backed this story.

Deliberately misleading or just a really bad AI?

⚡️🌱🌙 2y ago

Really bad compared to what exactly?

Now try the same test with an average human and you will be told “Ma’am this is a Wendy’s”


Discussion

Juspace 2y ago

Well, at least the Wendy's cashier will not give you plastic toy potatoes when you ask for a hamburger, and then, when you complain, fail to see that there is no hamburger and explain that the potatoes are still a good example of what food generally looks like.
