"Feminism" is a loaded word in the United States because it carries so many controversial connotations. Professional feminists often insist that they have a monopoly on the word and its meaning, which forces lots of people to reject the label. Conservatives are the most obvious example of that, but many young people, including very "liberated" young women, avoid the term because they think it means rejecting any traditional understanding of motherhood, courtship, etc.
But if you can lay aside all of those worthwhile arguments about Western society for a minute, the simple fact is that "the feminists" are absolutely right when it comes to the treatment of women in much of the developing world. If women were seen as a religious or racial minority, this would be glaringly obvious. Imagine if a white country refused to let blacks learn to read, never mind go to school or even go outside. I don't know a social conservative -- and I know many -- who doesn't agree with radical feminists when it comes to recognizing the barbarity of female circumcision, wife-burning, breast-ironing and the rest.
Setting aside the question of decency and morality for a moment, there's the matter of national interests. Female equality seems to be a reliable treatment for many of the world's worst pathologies. Population growth in the Third World tends to fall as female literacy rises. Indeed, female empowerment might be the single best weapon in the "root causes" arsenal in the war on terror.
The reason strikes me as fairly simple. Women civilize men. As a general rule, men will only be as civilized as female expectations and demands will allow. "Liberate" men from those expectations, and "Lord of the Flies" logic kicks in. Liberate women from this barbarism, and male decency will soon follow.