Well said, Jameson.
But... I think we first need to step back (or down?) and ask: what is natural intelligence?
One could argue that human intelligence is also the result of lossy encoding of OUR inputs via our sensors, and that we, too, are parroting our experiences. Even cognitive deductions, at a lower level, may just be outputs of complex algorithms applied to that same input layer.
Additionally, when it comes to LLMs, once the parrot's vocabulary and ability to generalize exceed our comprehension, perhaps the label is just semantics.
It's a fascinating subject!