No, I've actually seen it in practice. So many kids of friends, and even my own child to a degree, are going away to college and disavowing everything they've been taught for the last 18 or so years. Even though I believe it's the norm to question a parent's values as one becomes more independent, I've seen kids buy into the idea that hard work, family, and religion are "old hat" nowadays, and "do what feels good" has taken hold. The sense that hard work has value and purpose has slowly eroded. I wasn't specifically talking about homosexuality before; I just meant that what kids are being taught these days is virtually pablum. My own son, who is a business major, is required to take "Diversity Issues in the Workplace."