Is it weird that I don’t feel like I fit into Western society anymore?
I feel like the West is in decay and that there are no principles or standards anymore, because we let go of faith and religion and pushed wokeism, mass migration, and political correctness instead.
To be honest, my goal is to become financially independent and travel until I find a place where I feel I truly belong.