It is kind of terrifying to realize that I do not have the right to my own body. Instead, it seems like all of the old, white men in power get to choose what I can and can't do, and I think that is straight bull. There is no reason men should be deciding what women are legally allowed to do with their bodies. I can guarantee that if the tables were turned, men would throw a complete fit if women even thought about controlling their bodies, much less made laws about whether or not a man could have a medical procedure.
News flash for all you men: women are going to get abortions no matter what, and taking away their right to a safe, legal one endangers their lives. I guess you don't care about endangering their lives, though, because some of you want to sentence women to the death penalty for making the extremely difficult decision to terminate a pregnancy.
It doesn't make sense to me why men are even talking about this. This does not affect you. This is not your decision. So why do you get to be the ones who make it? These laws are a HUGE step back for women, and it's about damn time we stop this.
How about we try this: leave women the hell alone! Let them dress how they want. Let them decide if they want to get married or not. Let women decide if they want to be mothers or not. Men get those choices; it's time women got them too. We seem to be making no progress with these men in power, so how about we give women a chance.