I wish someone could explain to me in non-racist terms why South America is not part of "the west".
Western empires have drained capital and cheap labor out of South America for centuries, not the other way around.
Culturally, South America is "the West".
What is your definition of "Western"?
"Once upon a time in the West"