Would you agree that the United States has become more of a democracy over its history, as the right to vote has been expanded, the 17th Amendment made Senators elected by "the People" instead of by the state legislatures, and people have discovered that they can vote money out of other people's pockets and into their own?