Men have always given women their liberties; the world has always been a patriarchy, and it always will be.


Discussion

No, men have granted some of their rights to women only in the last hundred years; before that, women's role was limited to childbearing and homemaking.