When American parents send their children off to college, the flood of emotion can be overwhelming. No matter how much time and money you spent preparing for this pivotal transition in your child's life, you always feel you could have done more. Will they succeed? Will they be safe? Will I get a positive return on my investment in the form of a good job and a happy life for my child? What exactly have I spent the last decade saving for? What is a college education, really, and why is it important?
In recent decades, these questions have become increasingly urgent. For all the billions spent on higher education, our nation's youth seem to be graduating from college without much in the way of an education. There is a general consensus that a dumbing down of America is underway. Of course, there are the obvious culprits: too much sex, too much partying, too many distractions in general. Parents bear a good share of the blame for raising a generation of narcissists who lack the humility and work ethic to succeed. But there is another, more fundamental issue at play in the obvious shortcomings of American higher education today. There is a growing recognition that our move away from the classical understanding of what constitutes a proper education -- and of the ends that education serves -- is largely responsible for the problems we see in the classroom, the workforce, and the culture at large.