Here's one of the aggravating things about all this... the erroneous belief that the Founding Fathers were deists and not Christians. From the birth of this nation all the way through the 1960s, the Founding Fathers were unequivocally declared to be Christians. Then those who sought to defame this nation and its founding beliefs began to stretch and contort history to suit their agenda. If the Founders weren't Christians, what god did they believe in? If they weren't Christians, why did they carry copies of the Bible and refer to it so frequently?