After noticing so many worrying comments from friends in such a short span of time, I decided to truly take notice every time one of my friends or family members said something sexist. As a teenager I used to curse the name of feminism (because I truly had no idea what it was about), so a few years ago I took it upon myself to see what the fuss was about. Only then did I realize that yes, there is a problem with the way our society views women. We have the same rights; we know that. Yet there's still this divide between the sexes. Now, as a 21-year-old woman and self-proclaimed feminist, I take notice of those small comments. For one week I listened closely to what my friends were saying, and this is what I heard.
1. "Women can't drive"
This is a hasty overgeneralization, as if every single woman were incapable of driving. The comment came from someone who rear-ended an elderly woman because he was texting while driving.
2. "Men were born with an advantage over women"
What exactly is that advantage? I asked my friend what he meant when he said that, and his response was that he didn't know exactly what the advantage was but that it was there.
3. "Feminism killed beauty"
This came from a video a friend told me to watch; he was laughing at how "accurate" it was. It showed pictures of celebrities before they publicly supported feminism versus after. All of these women looked beautiful to me in both sets of pictures, but the commenters liked to believe that feminism had made them ugly. This one shook me to my core.
4. "Girls are so fake"
Again, this is just a stereotype and a generalization. I will say to you what I said to the person who uttered it: women aren't fake; we're sneaky, cunning, and smart.
5. "I tried to watch 'Orange is the New Black' but it's too much about feminism"
So, when a show comes out with an almost all-female cast, it's labeled as a feminist show. The underlying theme of feminism is nowhere in this show, yet because the cast is mostly badass women, some people think it's about making some political statement. Why can't an all-female cast be seen as normal?
These were the things I heard in one week from friends. Friends who are all intelligent, kind people. Friends who I know didn't mean to offend me. Friends who don't even realize what they're saying, because that's how they were raised. Sexism is so deeply rooted in our society that half of us can't even recognize it. Yes, women have the same rights as men; that's not what this is about. It's about the fact that women are, and always have been, treated and seen differently simply because of their gender. If you tell me you've never been affected by sexism, or never said or done something sexist yourself, you need to become more aware of your surroundings, because it's all around us and it's never okay. Don't deny the obvious: open your damn eyes.