Women are not marginalized in the United States of America. So what are they fighting for? Why all the anger? I just don't get it. This whole feminist movement, if I'm being really honest, smells of domineering, desperate estrogen.
Frankly, it is unnatural to try to dominate a man and emasculate him. I think one of the most beautiful things we can do as women is to let a man be a man, and to challenge him to rise to his highest form of masculinity. That is what a man's heart longs for: adventure.