I'd argue that both are ideologically the same. They will discover after college that they were fed a lie, which will sour their view of all institutions. Moral leniency will become justifiable. Self-respect will deteriorate. They will feel the world owes them what the institutions promised, and when they don't get it, they will stop trying to impress people and instead fight harder against cultural norms, stereotypes, and standards, becoming what feminism is in its most extreme, depraved forms.