By definition, feminism is the advocacy of women's rights on the basis of the equality of the sexes.
Many people, myself included, feel that women can be (and are) discriminated against or paid less simply because they are women, even when this is not intentional. However, I think feminism has drifted away from helping women advance in the workforce. Much of the feminism I have seen recently spends more time bashing men than anything else. What I mean is that instead of working to better themselves, some women spend their energy dragging men down.
As a woman, I feel that I and many others get placed under the feminist umbrella simply because we are women. I don't accept that label, because I disagree with much of what feminists support and do. Feminism is no longer about equality. Too often, it is used as a tool for women to claim that they are better than men. Men have problems too, and they can't simply be thrown under the bus or dismissed because they are men. Likewise, women talking down to men is not a great idea, either. There should be a balance between the sexes rather than one trying to overtake the other.
The many women's marches that have taken place across the country are just one example of how discombobulated feminism can be. Many people who attended the marches carried signs about the LGBT community, among other causes. Those causes have nothing to do with women's progress in the workplace or with being treated as equals to men. The marches felt more like a big temper tantrum by some of the women of America who didn't get what they wanted. Just because you don't like the result of an election doesn't mean you should go out and complain about it; that isn't going to make a difference once the votes have been counted.
The United States allows people to vote freely for whomever they would like for president, and President Trump was America's choice. Many people may not like President Trump, but he is the president whether you like it or not. There are far larger issues in the world that need attention than women being upset that they didn't get their way.
Lastly, identifying as a feminist has become a popular trend for celebrities like Taylor Swift and Katy Perry. Not to discredit their intentions, but when they call themselves feminists, many of their fans follow suit and call themselves feminists, too. The problem here isn't the celebrities but the fans who misuse the term "feminist" as a way to make women look better than men.