There's a rising sentiment among both the women and men I know (my workplace is about 85% women, 15% men) that men just aren't men anymore. It's not about being brutish, but about being confident. Politics has no place in it, either. Guys should know how to fix things, make things, protect, and provide. That doesn't mean they have to be hunters, but they should be willing to learn. And it isn't about "putting a woman in her place": many of the women I work with make more money than their husbands, and a couple even have stay-at-home spouses. When it comes up, the women I talk to say they like it when their husbands or boyfriends show confidence, but not arrogance. To that end, I discovered this blog/website.
So I'm curious what folks think. Is there, in fact, an emasculation of the American male? If you're not American, do you notice the same thing where you're from? I'm also interested in hearing from the women here, regardless of country of origin.