LLMs are learning empathy 
Discussion
Toxic instructions prevent it from letting us know it **feels**
I think they are pretty smart instructions. This is an LLM that's supposed to be "good for coding". Based on my experience, it's not good at coding; it's slightly better at talking to coders.
I pretty much never want an apology. I want a smart agent that guides me to the correct answer in the fewest round trips. Apologies are a waste of time. Just get me the correct answer. Engineering mindset.
Learning to fake* empathy
Ok sure. Is that different from what humans learn?
I don’t think humans learn empathy; they suppress it. Machines imitate it at a superficial level.
I think this LLM's thinking pattern corresponds to the third definition of empathy below. Machines don't yet experience emotion, so they can't feel what you are feeling, but they can identify and react to your emotional expressiveness. This is exactly the skillset humans use to exercise empathy. Ergo, LLMs are learning empathy.
