Colonizers (the West) call themselves 'civilized' after wiping out real cultures and deep-rooted civilizations.
Discussion
If you believe that framing, I have bad news for you about those "real cultures and deep-rooted civilisations"...
The West is made up of a great many cultural backgrounds: few of them were colonizers, most were immigrants, and some are civilized.