Feminism. The Merriam-Webster dictionary defines it as "the theory of the political, economic, and social equality of the sexes", or more simply, "the belief that men and women should have equal rights and opportunities". If that is the case, why does "feminist" seem like a label no one would ever want to wear today?
The other day I was talking with some guys, and one mentioned that if I were a feminist, I would have been calling them out and getting upset over some of the things they were saying. The word "feminist" was spoken with a tone of negativity, and I couldn't help but wonder: why is this still the case?
I am a feminist. And you probably are too. But because of radical feminists whose stories are always covered by the media, many people are quick to stereotype feminists as man-haters who burn bras, free bleed, and walk around public places topless because #freethenip (but to the feminists who do participate in these activities, keep on doing you, you are getting the word out). Judging a whole population by a select few radicals is nonsense. It would be as if I assumed that every white Southerner was part of the KKK, or that every Muslim was a terrorist, which is just ridiculous. It is the same darn thing.
Now let's set the record straight. You are a feminist if you think it is ridiculous that in the United States women make 79 cents for every dollar a man makes. You are a feminist if you feel that women are portrayed in advertisements far too often as pure sex appeal. If you think that women who are forced into marriages during childhood should have the right to choose a partner for themselves when they feel ready, you are a feminist. If you think women have the capability to do anything a man can do, and even do it better, then you're a feminist. You can be a stay-at-home mom and be a feminist. You can be a woman who is the breadwinner of the family and be a feminist. You can be a man and a feminist. Honestly, anyone can be a feminist.
We are not trying to blame men for the problems that exist in our world, nor do we hate men because these inequalities exist. But for so long, these inequalities have been overlooked by society, and we need to bring them front and center by educating people about these issues and promoting women's rights. That is not excessive, and we are not trying to shove an agenda down your throat. We are all human, and we all deserve equal rights. We can even be on the same team here if we work together.
Thankfully, women have come so far and gained so many rights and freedoms in the past few decades, thanks to advocates who made change happen, especially here in the United States, where we are so much better off than women in other parts of the world. We would never have seen the day when women could vote or hold the same occupations as men without the feminists who came before us. If we want to make any more strides toward equality, we must first acknowledge that women and men are still treated unequally, and that it is up to us to fix it.
I used to be reluctant to admit that I was a feminist because I was afraid of the stigma attached to the label. But honestly, I'm not so scared anymore. I am a feminist, and it is about time we all stopped being so scared. Girls just want to have fundamental rights, but a little fun is always good too.





















