Those who want to discredit the United States and to deny our role as history’s most powerful and pre-eminent force for freedom, goodness and human dignity invariably focus on America’s bloody past as a slave-holding nation. Along with the displacement and mistreatment of Native Americans, the enslavement of literally millions of Africans counts as one of our two founding crimes—and an obvious rebuttal to any claims that this Republic truly represents “the land of the free and the home of the brave.” According to America-bashers at home and abroad, open-minded students of our history ought to feel more guilt than pride, and strive for “reparations” or other restitution to overcome the nation’s uniquely cruel, racist and rapacious legacy.
Unfortunately, the current mania for exaggerating America’s culpability for the horrors of slavery bears no more connection to reality than the old, discredited tendency to deny that the U.S. bore any blame at all. No, it’s not true that the “peculiar institution” featured kind-hearted, paternalistic masters and happy, dancing field-hands, any more than it’s true that America displayed unparalleled barbarity or enjoyed disproportionate benefit from kidnapping and exploiting innocent Africans.
An honest and balanced understanding of the position of slavery in the American experience requires a serious attempt to place the institution in historical context and to clear away some of the common myths and distortions.