Hollywood has glorified sexual violence, pedophilia, and untraditional values. Many people seemed shocked by the many allegations surrounding Hollywood figures, from Bill Cosby to Harvey Weinstein. However, this doesn't surprise me at all. We have movies such as "50 Shades of Grey" that glorify sexual abuse, fetishes, and psychological manipulation. Things like that movie actually take place in real life, such as in the cult NXIVM or what happened in Epstein's mansion. Hollywood glorifies things like this on screen, then people are shocked when it happens in real life.
Hollywood is also known for promoting sexually intense subject matter to a young audience. Shows that aired on Nickelodeon and other so-called kids' channels were filled with "hookup subculture" and sexual innuendos. There have even been theories that Dan Schneider has been involved in pedophilia as well. For the record, those are just rumors, but they have been fueled by pictures of young girls on his lap and the type of shows he has produced.
Regardless, there is an obvious problem in Hollywood. If we want to get rid of this whole disgusting subculture and stop the hypersexualization of our youth, we need to acknowledge that we have a problem. That problem can only be fixed if we start taking notice of what has actually been going on. We as a society could also watch and respect far more worthwhile entertainment. I have no problem with shows such as Shark Tank, Mythbusters, Dragon's Den, and Invention USA. However, shows that end up promoting things that demoralize society or destigmatize serious topics are absolutely problematic.