It hasn’t; OpenAI has simply tuned GPT-4’s responses to match the caliber and quality of the prompt.
GPT-4 will only give good responses if the prompt shows deep understanding and thinking. This is how they’re making it a personalized tool for all.
IMO, and based on what I’ve read and understand.
Did you miss a /s in there?
Wait… they’re making artificial intelligence match the intelligence of the person using it? 
I guess I gotta upgrade the quality of my prompts. Kinda defeats the purpose, no?
It kind of makes sense to me. For instance, if a regular person asks it about quantum mechanics and it starts giving them complex, astrophysicist-level mathematical equations, the result will be completely irrelevant to them.
I agree actually, it kinda does make sense
“Meet them where they’re at” sorta deal
With prose answers it could at least point you down a path to more complicated topics, but with code it won’t even try.
I used to get it to spit out some advanced JavaScript to do all sorts of three.js things, and now it falls back to “well, that’s complicated.”
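For context, here’s roughly the kind of thing it used to write without complaint: a minimal three.js spinning-cube sketch (my own illustrative example, assuming the standard three.js API), nothing exotic.

```javascript
// Minimal three.js scene: a spinning cube (illustrative sketch only).
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75,                                      // field of view in degrees
  window.innerWidth / window.innerHeight,  // aspect ratio
  0.1,                                     // near clipping plane
  1000                                     // far clipping plane
);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

function animate() {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```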
Yeah, it should definitely provide more paths to a more comprehensive answer. One thing I like about Bing and Google Bard is that they point you to resources. It's constantly changing tho, that's true.
I just tested this, and you’re correct. It only provides better code when you explicitly spell out, in pseudocode, how the function should work. That’s dumb and backwards.
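To illustrate what I mean (entirely hypothetical prompt and output, just a sketch of the pattern): asking for “a debounce helper” gets hand-waving, but a prompt that spells out the steps as pseudocode produces something like this:

```javascript
// Prompt spelled out as pseudocode (hypothetical example):
//   write debounce(fn, delay):
//     - keep a timer id between calls
//     - on each call, clear any pending timer
//     - start a new timer that calls fn with the latest args after `delay` ms
//
// ...and the kind of function it then returns:
function debounce(fn, delay) {
  let timerId;
  return function (...args) {
    clearTimeout(timerId);
    timerId = setTimeout(() => fn.apply(this, args), delay);
  };
}

// Usage: only logs once the user stops typing for 300 ms.
const onType = debounce((text) => console.log('search for', text), 300);
```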
My guess is this stuff will be segmented and sold separately to different audiences. Developers will get a developer tuned model that probably costs more than the generic version and will do a better job at interpreting dev tasks.
But who knows, maybe I’m wrong.
I mean, that makes sense. Developers are definitely the power users, and they may also offer some more developer-specific tools for that plan.
I’ve never paid for GPT-4, only used 3, but:
these NNs rely heavily on the input to generate the output. In the paper “Attention Is All You Need” you can see that each word of the output depends on the previous words of the output AND on the whole input.
A smart, accurate, logically and grammatically sound input with precise vocabulary will make it easier for the NN to generate a coherent output with correct jargon.
This doesn’t rule out any bias, filtering, or other nasty stuff. But it’s not magic: garbage in, garbage out.
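To put those two points in symbols (just restating the paper in my own notation, nothing new):

```latex
% Autoregressive decoding: each output token y_t depends on the
% previous output tokens y_{<t} AND on the whole input x.
p(y_1, \dots, y_T \mid x) = \prod_{t=1}^{T} p\!\left(y_t \mid y_{<t}, x\right)

% Scaled dot-product attention from "Attention Is All You Need",
% with queries Q, keys K, values V and key dimension d_k.
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
```

So a precise, well-structured prompt literally gives the model better material to condition on at every step.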