Great article. Something that stood out to me is that LLMs can and will hallucinate tool usage, chain-of-thought, and “reasoning” explanations.
I know MCP is the hottest new thing, but if models can fabricate tool calls and the results of using them, it may not be as big a game changer as some believe.
https://www.mindprison.cc/p/no-progress-toward-agi-llm-braindead-unreliable
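
For anyone wiring tools to a model: fabricated tool calls are at least cheap to catch at the boundary. Here's a minimal sketch in Python (the tool names and the call/registry shapes are made up for illustration, not any particular MCP SDK) that rejects calls to tools the server never exposed:

    # Hypothetical registry of tools the server actually exposes:
    # name -> required argument names.
    KNOWN_TOOLS = {
        "search_docs": {"query"},
        "read_file": {"path"},
    }

    def validate_tool_call(call: dict) -> bool:
        """Accept only calls naming a real tool with exactly its required args."""
        name = call.get("name")
        args = call.get("arguments", {})
        if name not in KNOWN_TOOLS:
            return False  # hallucinated tool: the model invented a name
        return set(args) == KNOWN_TOOLS[name]  # wrong or missing arguments

    # A legitimate call passes; a plausible-looking fabricated one doesn't.
    print(validate_tool_call({"name": "search_docs", "arguments": {"query": "MCP"}}))  # True
    print(validate_tool_call({"name": "delete_repo", "arguments": {"repo": "x"}}))     # False

Of course, this only guards the call itself; it can't stop the model from hallucinating what the tool returned, which is the harder problem the article gets at.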