It actually makes stuff up too; it’s called AI hallucination. It “predicts” based on what it thinks human behaviour might be or do. But once you ask it for sources and studies, it can’t provide any and will say “there aren’t actual studies.”
