It came up with a very detailed answer about something that isn’t true and never happened. I guess it can’t lie 🤔 but it’s definitely not working right
If there was context then I could say why it responded the way it did.
But more than likely it took the first data point it could find.
This occurred when I was researching IQ differences between Republicans and Democrats with Brave AI.
The initial response was that Democrats were more intelligent because of higher college attendance, but when I ran the query again it said Republicans, since it referenced Carl 2014.
AI can only work with the information it has available to it.
Rather long to explain, but I asked it if a specific person had mentioned a book, and it gave me specific sources and explained what the person said about the book. When I asked for more info, because I couldn’t find what the person said in the first source, it apologised and said the person had actually never mentioned the book, but that they like similar topics.
I do like AI, but she’s not wrong: it’s a bullshit engine 🙏🏽, hit or miss for me. Predicting and assuming and then presenting that as fact is odd, and I don’t like that it does that. Some things are accurate, but I wouldn’t use it for anything important, and I’d double-check any facts I get from it for sure.
That’s the problem with AI: it can only work from information available on the internet.