When were they truly conservative in America? When only landowners could vote? Only whites? When you could own slaves? When we slaughtered most of the natives and put the remainder on reservations, scraps of the homeland we took from them? When we bombed striking workers instead of giving unions rights? When we destroyed Black Wall Street out of spite? When we could still discriminate freely based on race? Or gender? America has been whittling away at these hierarchical power structures, but it has been an unending struggle to restrain the elite class.
I don't need to discuss history.
Over the last 40 years, American right-wing politicians have consistently called for radical change and never once promoted any genuinely conservative action.
What do you think they're trying to conserve, then? The status quo? In that case, the only truly conservative policy would be to pass no new laws and repeal none. The only time they haven't proposed radical change was when the elite class was sufficiently entrenched, which aligns with my accusation that their only real policy is establishing and supporting a wealthy elite class. Once they started failing at that, they had to start dragging us back, and the farther we progress, the farther back they try to drag us.