I was making a totally different point. I never questioned the fact that America was democratically elected. I was pointing out that Americans aren't taught we're a "federation," as the term "federal government" would imply; we're taught we're a republic, which doesn't seem to be true at the moment.


Discussion

I see. Yeah, the USA doesn't really behave like a republic. The Civil War (the war between the states) pretty much ended that, along with the actions of Presidents and Congress over the years. We still have some aspects of the OG system, but not much.