Let me move straight to the point: the modern feminist movement in the United States needs to reconsider its priorities and activities in the Western world. Please don't misunderstand where this is headed — feminists have plenty of justifiable complaints about the treatment of women in many parts of the non-Western world. And even within the First World, a number of legitimate problems exist that would be worthy pursuits for all people (e.g. equal pay for equal work and the eradication of human trafficking). Instead, feminists in this nation have misallocated their resources toward lesser issues that sow more division in society. Feminists have actually lowered themselves to the level of the very men they have demonized.
The definition of feminism invites no controversy and is, in itself, what society ought to be working toward: the belief that men and women should have equal rights and opportunities. I don't know of any sane person who believes that women shouldn't have the same opportunities as men. But modern feminists have done much to squander the credibility their predecessors earned.
Feminists often dismiss critiques leveled against them, particularly if they come from men. Of course men will not inherently understand everything a woman experiences, because the two sexes are indeed not identical. There are differences that cannot be reconciled. That same logic applies to women in their understanding of men. This does not preclude women from bringing legitimate critiques against the men of American society. In fact, many of the equal protections women now have came about because men were willing to accept that they were indeed wrong.
Feminists have also become very adept at pigeonholing all men as oppressive in some fashion, as if men are part of the 'system' and can exert no will of their own. Feminists have fallen victim to a sort of mania, seeing visions of sexism where none actually exists. If a man interrupts you while you're talking, it may not be because you're a woman; it might simply be that this particular man is a jerk to everyone, regardless of gender. If a man receives a promotion over a woman, can that automatically be attributed to sexism? Feminists too often draw a sweeping conclusion from a limited set of facts.
Our society doesn't condemn a group, writ large, for the mistakes and misdeeds of a few people from that group. Would feminists find it acceptable for a person to characterize all Muslims by the actions of the perpetrators of the 9/11 attacks? Better yet, would a feminist deem it permissible to base beliefs about women on the actions of a few? Whether it be men, women, Muslims, Christians, blacks, whites, or any other group, generating a stereotype adds nothing to the conversation about equality of opportunity.
Feminism has also served to denigrate the status of women in American society. Though unintentionally, feminists have triggered a shift in social norms that teaches women to engage in the very behaviors for which they have critiqued men. Women now operate with a mindset that says the objectification of people (whether men or other women) is acceptable behavior.
In business and politics, women have become just as ruthless as men in their callous treatment of others in order to climb the corporate ladder. Achievement has become the focus of life. On the issue of sex, women were right to call out men for their objectification of women, yet many now treat objectification as fair game. Years ago, women were far less likely to abuse alcohol, and scolded men for doing so. If a person enjoys their work, alcohol, the 'hookup' culture, or any other facet of life, that is their prerogative. However, it is hypocritical for feminists to have deemed such behavior unacceptable for men, only to find it permissible now that women can engage in it too. Perhaps achieving certain levels of equality actually made women into what they were annoyed with in the first place.
The hypocrisy prevalent among radical feminists further plays into the hands of their critics because of how they view men. British sociologist and feminist Ann Oakley wrote, "Men are the enemies of women. Promising sublime intimacy, unequalled passion, amazing security and grace, they nevertheless exploit and injure in a myriad subtle ways." American feminist Mary Daly voiced her opinion of men, stating, "If life is to survive on this planet, there must be a decontamination of the Earth. I think this will be accompanied by an evolutionary process that will result in a drastic reduction of the population of males." When feminists view men with such disdain, how can anyone be expected to have a dialogue about the real problems of the world? Feminists seem to have taken on another terrible quality they claim to despise: they believe in their own superiority over men.
Feminists have also remained silent about what a woman ought to do when she is the beneficiary of a double standard. Men are required to register for the Selective Service so that they can be called upon for military service if ever necessary. Why don't feminists seek to create equality of opportunity there? Nor have I heard feminists complain about the unequal application of capital punishment in America. Men are executed at a grossly disproportionate rate, while we coddle women who commit heinous crimes on the grounds that we are compassionate toward the 'fairer sex.' Women who become pregnant have the 'right' to an abortion without the consent of the biological father. The irony of this belief is that by treating pregnancy as a situation unique to women, feminists are also admitting that inherent differences between women and men do exist and cannot be overcome. Why doesn't a father have the right to stop an abortion? Should he not have an equal say in the fate of a child he helped create? Feminists want equality of opportunity — so long as it serves their agenda.
Finally, feminists have taken the approach that anyone who challenges their perspectives on gender is obviously sexist and incorrect. They now believe they are the only enlightened, intelligent people with anything to say, and they marginalize the women who do not fit their expectations. Feminists have no answer for the fact that so many women still enjoy and seek out the traditional roles females have fulfilled in society. Are millions of women somehow being deceived and conditioned to accept certain roles? Or is it more likely that arrogance and small-mindedness have blinded feminists to even the hint that they might be wrong?