
Sex, Sex, Sex. Get over it.

Fellow interneters, imagine with me if you will: you're in a group, people you know or people you don't, it doesn't matter. The subject turns from something innocent about the upcoming elections to, dare I say it, sex. Not just 'hey, that's sexy,' but 'oh my god, there is sex everywhere and my (or someone's) children could possibly hear about where babies really come from, and now I am super offended.' Now everyone is in a giant debate over their personal views of sex. Some are offended and leave, some yell and scream and don't listen to what anyone else has to say, and some never get their point across.

So, now you have this visual. Someone please tell me why it's such a hot topic. Personally, I think we are way too hung up on it. We can't have our children finding out about sex! It's immoral. It's disgusting and dirty and better just not to think about. The question I pose to you is: why are we still like this? If we just taught our children about sex, it wouldn't be a big deal. I know it's cliché, but knowledge is power.

I feel like we shelter our children from way too much. Bike helmet obsessions, not allowing children to play outside alone, fear of generally everything, and an overwhelming fear of our children seeing a naked breast or, god forbid, a penis; when are we going to stop bubble-wrapping our children?

Americans are afraid of sex. Why is that? Why is America so hung up on its own fear of sexuality that we have to persecute Janet Jackson for years to come over a slip of a pasty-covered nipple at the Super Bowl? Why is that more important than just about any other topic? If you want to turn the American public's attention away from important matters, just say something that involves children and sex. That'll start some new fires and leave your fraud, political mishap, or whatever bad thing you were doing to die in peace without the American public even knowing it was there. American priorities are in shambles.

Who cares if there is nudity? People used to think the human body was beautiful, not shameful. When did my vagina become shameful? Why are we teaching our children to fear their own bodies? It's not just Hollywood that teaches girls to hate their bodies. It starts with their own mothers and fathers telling them about that special private area, you know, the one that's dirty and ugly and never to be looked at, much less touched unnecessarily. Hmm... And so this overwhelming fear of our own sexuality spreads like an insecure disease to our children, in an unending cycle of self-hatred and obsessive insecurity around the topic of sex and sexuality. How can we change this?