As someone from a third world country, I used to think the western world was a shining beacon of light, especially when it came to free speech and women's rights. I dreamt of the day I would live in America and brand myself a liberal and a feminist, because to me, people who pursued liberty and equality for both genders (yes, I'm only going to talk about the two widely accepted genders) were liberals by definition.
Then I entered college and got my first social education in how American women are avoiding feminism like the plague, and how things like 'trigger warnings' and 'safe spaces' have become the new goals on college campuses.
What the actual f*** is going on??
Since when did feminism become man-hating and all about encouraging misandry? And since when did differing opinions, protected under the right to free speech, start being labeled narrow-minded, bigoted, and racist? Why is this happening?
I know there is a long history, going back to the '90s, that will answer most of these questions. But for what it is worth, I am embarrassed that, as a foreign born and bred millennial, I have to watch some of my peers become more and more fragile, paranoid, helpless, and unwilling to entertain opposing views in social conversations.
All this because men are somehow evil by nature? And because opinions that differ from the mainstream media's norms hurt feelings?
Feminism has become a cancer in America today, and I am saddened that I keep finding more reasons not to call myself a feminist. As far as free speech is concerned, the fight is far from over. I still have to look over my shoulder and read the crowd before I give my rawest, most honest opinions; the same precaution citizens in most third world countries take to avoid being persecuted, jailed, or killed for expressing themselves. Wake up, America.