Twisting the Internal Polling Knife

In light of the revelation that Mitt Romney was "shell-shocked" by his loss last week, I've been pretty tough on the job performance of his campaign's internal pollsters, who clearly missed the mark -- a failure that drove costly tactical decisions down the stretch:

These analyses [of the "expand the map" strategy] make sense, but only within the context of the campaign truly believing that they were safe in other crucial must-have states -- a cataclysmically wrong assumption. When I stopped by Romney headquarters in Boston back in September, Newhouse said his team was anticipating a D+3 electorate in November. This seemed entirely reasonable to me, based on evidence from 2004, 2008 and 2010, but it turned out to be incorrect. The actual electorate this year was D+6. Post-election news reports reveal that Mitt Romney was "shell-shocked" by his loss, an outcome that can only be explained by shockingly flawed internal polling. Was that polling predicated on a D+3 model? If so, that would explain the huge disconnect between Boston's expectations and the final results. I'll reiterate that although the D+3 model seemed sensible on its face, it was the campaign pollsters' job to figure out if their assumptions comported with reality. In retrospect, their failure to do so looms very, very large.  


As if to pour salt in the Romney campaign's gaping wound, David Axelrod tells Politico today that Team Obama's in-house pollster was deadly accurate in his projections:

POLITICO: What's the most important tool you had this time that you didn't have in '08?

AXELROD: "We had some solid accomplishments and proof points ... We knew a lot more about the electorate than we did in 2008. We could make much more precise judgments about the attitudes of voters, about what was important to individual voters, about who was likely to participate and who wasn't likely to participate. So we had great confidence in our numbers. I got reports every night -- all the senior people did -- from our analytics guys about where all these battleground states were. And they were remarkably close [to the actual result. Joel] Benenson's polling was within a tenth of a percentage point in the battleground states. Our individual pollsters in their individual states -- incredibly close. What you want in a campaign is as little surprise as possible. Nothing happened on election night that surprised me -- nothing. Every single domino that turned over was in keeping with the model that our folks had projected."


Moral of the story: When you're working from rock-solid data, unpleasant surprises are far less likely to occur.