Replying to jb55

o3 feels like AGI. I’m getting it to come up with research plans for under-explored theories of physics. This has been my personal Turing test… this is the first time it has actually generated something novel that doesn’t seem like BS.

https://chatgpt.com/share/6803b313-c5a0-800f-ac62-1d81ede3ff75

An analysis of the plan from another o3:

“The proposal is not naive crankery; it riffs on real trends in quantum-information-inspired gravity and categorical quantum theory, and it packages them in a clean, process-centric manifesto that many working theorists would secretly like to see succeed.”

https://techcrunch.com/2025/04/18/openais-new-reasoning-ai-models-hallucinate-more/


Discussion

Isn’t generating new knowledge by definition a hallucination?

nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s irl friend, for most cases - practical app.