I am curious about one thing. Men are physically stronger than women. In the biological, natural world, it is generally the species or the sex that is physically stronger that wins.
How do you expect to change that? Men are not going to become physically weaker, and because of their biology they have more testosterone, which makes them more violent and aggressive.
This was Mother Nature ensuring that one of the sexes had the guts to fight off threats when they came their way.
Do you want to change that?
In terms of the law, we do not live in a male-oriented society. However, nature will always rule. There is absolutely nothing one can do about the fact that men are more dominant than women. That is the case for most species.
Of course, there is one option, but I don't see any women taking it. It's simply to walk away from men and live their own lives: take no husbands, lovers, or boyfriends, open their own businesses, and work only with other women. That way, they can be totally free of the influence and power of men.