Amazing that it will actually acknowledge that. I have very little knowledge about the intricacies of LLMs, but it should be possible for it to tell you how confident it is in its answer.
Moving on from that, it would be great if it could simply say "I don't know".
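For what it's worth, one rough proxy people sometimes use is the model's own per-token probabilities (many APIs expose these as "logprobs"). A minimal sketch of the idea, assuming you already have the logprobs for a generated answer — the function name and the example numbers are made up for illustration:

```python
import math

def answer_confidence(token_logprobs):
    """Geometric-mean probability of the chosen tokens:
    a crude proxy for how 'sure' the model was of its answer."""
    avg_logprob = sum(token_logprobs) / len(token_logprobs)
    return math.exp(avg_logprob)

# Hypothetical logprobs for the tokens of two generated answers.
confident = [-0.05, -0.10, -0.02]   # near-certain tokens
unsure    = [-1.8, -2.3, -1.5]      # low-probability tokens

print(answer_confidence(confident))  # close to 1.0
print(answer_confidence(unsure))     # much lower
```

A caveat: token probability measures fluency, not factual correctness, so this is at best a weak signal and not the calibrated confidence the comment is asking for.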
