Righty3 Wrote:
Jan 21, 2013 11:10 AM
What happened to America? Simple. Good people with traditional values did nothing as the Left took over the government bureaucracies, media, schools, and courts. Five decades later, we're not a center-right country anymore; the underclass has taken over, and the rest of us are slaves to their majoritarian ways. They want what you have, and Obama shows them how to take it from you. Welcome to the "fundamental transformation". Get used to it.