Men in America

Something ominous is happening to men in America. Everyone who pays attention knows that. What's odd is how rarely you hear it publicly acknowledged. Our leaders pledge to create more opportunities for women and girls, who, they imply, are failing. Men don't need help, the thinking goes. They're the patriarchy. They're fine, more than fine.

But are they fine?
