To start off, I am a twenty-year-old liberal college student, so if you find that appalling then you might want to stop reading here.
Ever since I became comfortable enough with my body to wear a bikini, an entirely new and empowered young woman was born. Then crop tops came into style, then bralettes as tops, then even tube tops came back as a trend, and women showing skin slowly started to be celebrated by society, as it rightfully should be. I remember being in elementary school, pouting about the fact that I couldn't wear shorts to school on a hot summer's day. I remember the reason being that it was "inappropriate" or "distracting" or some absurd reason of the sort. But as I've gotten older, I've realized that society was merely sexualizing young children—that's almost as bad as James Charles preying on straight men, if not worse.
I remember getting into middle school and high school, where I thought the rule made maybe a little more sense. I had a better grasp on what sexualizing even was, and the oppression seemed slightly more justified, but I never really accepted the reality behind it. Heck, I couldn't even show my shoulders in school because it would apparently give my male teacher a boner? ... Yikes. That doesn't sound like a me problem. That sounds like a men problem.
Right now, I'm in college, and showcasing my body is my favorite thing to do. Wearing a tube top and skinny jeans, maybe a short low-cut dress with heels, or even a leather bodysuit with a short skirt really does something to empower me, and I know it does the same for many of my friends. Going out, showing off your body, being proud of your body and who you are—that takes a lot of confidence and courage, and for men and disloyal women to tell us that we can't do that is almost dehumanizing.
I read another Odyssey article about why we should basically boycott cheeky bikini bottoms, and I haven't been that upset in a while. We should embrace cheeky bikini bottoms, and embrace them to their fullest. In Europe, men wearing Speedos is quite common. Heck, nude beaches are even kind of common too. So for another woman to tell me that I shouldn't wear a skimpy bathing suit is not only internalized sexism but also just complete BS.
Way back when, in the "olden days," women couldn't even show their ankles. The thought of a woman showing her ankles was absurd. Yet with time, society progressed and realized that ankles... they aren't all that bad. Who knew? And then they realized that shins, hey, shins... they're not all that bad either. And then it went on to knees and thighs and hips and midriff—you get the gist. So if I want to show a little bit of my butt on a hot day at the beach, you best believe that if the sun's out, my buns will be out, because that's life and I am a woman and I am empowered—and I'm not going to let sexist men and women take that away from me or shame me for being unapologetically me.
I could go on and on about this. How society needs to stop blaming victims of sexual assault for showing some skin, and stop claiming that showing your legs or a little bit of cleavage on a night out means that you want to take it hard and fast—even if you say "no." Clothes are a way to express ourselves. You can wear a t-shirt that supports your favorite band, a goth can wear all black, a frat boy can wear his Patagonia—so why can't I, a twenty-year-old liberal college student, wear my cheeky bikini bottoms?
Just a question for thought.