More tall stories from "Obama-land!" Isn't it wonderful to live in a world where good is bad and evil is whatever the leader says it is? How did we get here? Are there actually people who believe whatever Obama tells them is the truth? When did Americans start falling for such apparent lies? Have these "Americans" been there all the while? Did they want to become Socialists and discard Capitalism? Who convinced them that was the better way of life?