Recently, I've noticed that there are a lot of "I am not/never will be a feminist" sentiments floating around. Maybe it's just because of the culture of the area I live in (rural Tennessee), but a shockingly large number of young women are denying the label of feminist.
I find this extremely confusing.
In all honesty, cultures like this need feminism just as much as those in bigger cities. Society as a whole values and encourages toxic masculinity, but it sometimes seems more extreme in small, rural settings, where femininity is both required and frowned upon. Women are expected to perform traditional gender roles, but at the same time, being "not like other girls" is a popular trend. "Like a girl" is seen as the worst insult you can throw at a boy. "You throw like a girl!" and "Stop crying like a little girl!" are both unfortunately commonly used phrases when "teaching" young boys.
The thing is, many of the women I know who avoid being labelled "feminist" actually support feminist ideas! They want the opportunities to do the same things that their fathers and brothers are able to do! They don't like being told, "you shouldn't do that because you're a woman!" They want to be shown the same amount of respect as any man in their field.
I can honestly say that I am thankful that I finally realized that I do, in fact, need feminism. To me, it's not enough to simply believe I should be seen as equal to my male peers. I need to speak up, and fight for my right to be treated equally. I cannot sit quietly and wait for others to do it for me. I absolutely need to acknowledge the struggles and victories that women before me have faced, and use that to continue the fight for the voiceless and the future generations.
If you are wondering why this is so important to me, why "feminist" is a label I not only wear proudly, but that I absolutely need, then this is for you. Here are eight reasons why I absolutely still need feminism.
1. Because of all the confusion that surrounds Women's History Month
"Why do we need Women's History Month? What about men's history?" are annoying questions, but they're not quite as concerning as the questions I get when I try to explain why Women's History Month is so important. Ask people to list important men throughout American history, and the list is easily composed. Important women from history, on the other hand, are often overlooked or completely ignored! We need to acknowledge the brilliant, wonderful things the women who came before us accomplished!
2. Because of the wage gap
Yes, this is a real thing, and it exists in various forms. On average, white women make only about 70-80 cents for every dollar a man makes, and that number drops even further for women of other ethnicities.
Girls are conditioned from a young age to pursue certain careers, such as teaching or nursing. There is nothing wrong with these jobs; they are very important. However, they are horribly underpaid. Meanwhile, boys are conditioned to aspire to bigger, higher-paid careers in science, business, and technology. It's not unusual for girls who are interested in STEM to have their dreams crushed because they are taught to play dumb, because "boys don't like it when you're smarter than them."
When women make it into male-dominated careers, they often have to work twice as hard (at least) to achieve just as much as their (under-performing) male peers, and they will still often be overlooked for raises and promotions.
3. Because there is a major imbalance in positions of power
In a 2016 article by Time, it was reported that "women held just 23% of government offices," and "[t]here are four times as many male senators as female ones, and out of the 100 largest cities in the country, just 19 are led by female mayors."
There has never been a female president. Ever. It's been less than one hundred years since we even won the right to vote! And that right is still being challenged today.
4. Because there is a major imbalance in entertainment - both in front of AND behind the camera
I'm embarrassed to admit that I can only think of two major female directors: Ava DuVernay and Patty Jenkins. That's horrifying.
The website Women and Hollywood has a TON of brilliant information on this topic. According to their statistics, women represented only 8% of directors, 10% of writers, and 24% of editors on the top 100 grossing films of 2017, and onscreen they made up only about 34% of all speaking roles in these films - and those numbers get much smaller when you look at different ethnicities. Representation matters, but unfortunately there just isn't enough of it.
5. Because men are making decisions regarding women's health
In 2017, there were multiple instances where people around the country were outraged by photos that showed rooms full of men making decisions regarding women's health. Policies about women's health are important - lives are at stake - so it's important to have women's perspectives on those issues.
6. Because rape culture is still a thing
We currently have a president who has bragged about sexually assaulting women. In December, a man accused of pursuing and sexually assaulting girls who were young teenagers at the time was almost elected Alabama's senator. Popular, GRAMMY-nominated songs with rape-y themes are celebrated (looking at you, "Blurred Lines"). Rapists are getting away with minimal jail time because they're "talented athletes." Rape culture is very real, and very disgusting.
7. Because "feminine" things are still seen as negative things
Having any sort of interest in clothes or makeup is looked down on or laughed at as being "shallow." "You fight/throw/run like a girl" is a terrible insult for young boys! Stay-at-home moms are looked at as being "lazy." Because of this, there's a twisted "I'm not like other girls" mentality that many young women adopt as a way to separate themselves from traditionally feminine things. The truth is, there is nothing wrong with feminine things. As long as you like them because YOU like them, and not because you feel like you're supposed to, it's okay!
8. Because so many people still don't understand it
Every time I see another "I don't need feminism" or "I'm a woman but I'll never be a feminist" post, it makes me even more aware of how much I need it. Being a feminist has opened my eyes to many issues I was previously unaware of, and just because I do not face all of the same oppression and struggles that many other women face does not mean I should not fight for those women.