Jagged Intelligence
The term I came up with to describe the (strange, unintuitive) fact that state-of-the-art LLMs can perform extremely impressive tasks (e.g. solve complex math problems) while simultaneously struggling with some very dumb problems.
E.g. an example from two days ago: asked which number is bigger, 9.11 or 9.9, the model gets it wrong.
https://x.com/karpathy/status/1815549255354089752
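For reference, the ground truth is a one-liner in any language; a trivial Python check (nothing model-related is assumed here):

    # Plain numeric comparison: 9.9 > 9.11, even though the digits
    # "11" might look bigger than "9" after the decimal point.
    print(9.9 > 9.11)  # True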
or failing to play tic-tac-toe, making nonsensical moves:
https://x.com/polynoamial/status/1755717284650176591
or another common example: failing to count. Asked how many times the letter "r" occurs in the word "barrier", ChatGPT-4o claims it's 2 (the answer is 3):
https://x.com/karpathy/status/1816160802765955186
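Again, trivially checkable in Python:

    # "barrier" = b-a-r-r-i-e-r, so the letter "r" occurs 3 times.
    print("barrier".count("r"))  # 3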
The same is true in other modalities. State-of-the-art LLMs can identify thousands of species of dogs or flowers reasonably well, but, e.g., can't tell whether two circles overlap:
https://x.com/fly51fly/status/1812599708134916218
Jagged Intelligence. Some things work extremely well (by human standards) while others fail catastrophically (again by human standards), and it's not always obvious which is which, though you can develop a bit of intuition over time. This is different from humans, where knowledge and problem-solving capabilities are highly correlated and improve together, roughly linearly, from birth to adulthood.
Personally I think these are not fundamental issues, but they demand more work across the stack, not just more scaling. The big one, I think, is the present lack of "cognitive self-knowledge", which requires more sophisticated approaches in model post-training than the naive "imitate human labelers and make it big" recipe that has mostly gotten us this far. For an example of what I'm talking about, see the Llama 3.1 paper's section on mitigating hallucinations:
https://x.com/karpathy/status/1816171241809797335
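If it helps to see the shape of that kind of approach, here's a rough sketch, in Python, of the flavor of data pipeline the paper describes (my paraphrase from memory; every callable here is a hypothetical stand-in, not the paper's actual method or code): probe the model with questions whose answers appear in its training data, and build training targets that teach it to decline when it consistently answers wrong.

    # Sketch of a "probe what the model knows" post-training pipeline.
    # ask / answer / is_correct are hypothetical stand-ins, not a real API.
    def build_factuality_examples(snippets, ask, answer, is_correct, n_samples=4):
        examples = []
        for snippet in snippets:
            question = ask(snippet)  # a question about text the model trained on
            answers = [answer(question) for _ in range(n_samples)]
            graded = [is_correct(a, snippet) for a in answers]
            if not any(graded):
                # The model reliably gets this wrong: train it to decline
                # rather than confabulate an answer.
                examples.append((question, "I don't know."))
            else:
                # Keep a correct sample as a positive training example.
                examples.append((question, answers[graded.index(True)]))
        return examples

    # Toy usage with stand-in callables:
    data = build_factuality_examples(
        snippets=["Paris is the capital of France."],
        ask=lambda s: "What is the capital of France?",
        answer=lambda q: "Paris",
        is_correct=lambda a, s: a in s,
    )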
For now, this is something to be aware of, especially in production settings: use LLMs for the tasks they are good at, be on the lookout for jagged edges, and keep a human in the loop.

Source: x.com/karpathy/status/1816531576228053133