Men have always given women their liberties; the world has always been a patriarchy, and it always will be.
Discussion
No, men have extended some of their rights to women only within the last hundred years; before that, women's role was limited to childbearing and homemaking.