I'm sorry. Just the emotions that follow after realizing people believe the homosexuals and the liberals are conspiring to teach sex to your kids. Disney has been doing that for much longer and with much more subtlety. I can name plenty of other places your kids are getting it as well. Anyone have a young one who absolutely adores that new pop diva? Anyone teach your kid that his genitals are a sin? That his body is disgusting? I just think this whole thing is misdirected and wrong. It's all part of a bigger picture that no one is going to see by simply skimming through different topics. There's always a source to any problem, and you all seem as far away from it as humanly possible. Hope that clarifies this. Probably not.