The climate in the West is changing. What was once a symbol of freedom and justice is increasingly becoming the opposite. It won't be long before the West needs to be saved by the Global South.
It is always striking how an arrogant self-image can turn into its opposite, even within a single lifetime. Where have all those great "Western values" gone, the ones with which Western leaders flaunted moral superiority over the rest of the world for decades?
I feel sorry for the ordinary man who simply goes to work every day to feed his family. Everything happens over his head, orchestrated by the powers that be for whom he toils.