After a few weeks spent tracking down and questioning pollsters and the reporters of polls, I can assure the reader that pollsters are the modern-day alchemists. They promise to turn numbers into predictive gold. We'd all like to believe these magical powers exist, but we shouldn't. The pollsters of 2012 don't know who is going to win in November any more than the pollsters of 1980 knew that Ronald Reagan was headed toward a landslide in that late-breaking year.
I'd like to believe Scott Rasmussen that the race between Mitt Romney and Barack Obama is tied. Democrats would like to believe Quinnipiac (used by the New York Times and CBS) or Marist (used by the Wall Street Journal and NBC) that Obama has surged to a lead in Ohio and other key battleground states. They'd also like to believe that Gallup's finding of a six-point lead for the president among registered voters means a six-point win in five weeks.
But none of these beliefs are good journalistic practice.
(Gallup's tracking poll changes to a "likely voter screen" next week, and then it will make the most sense to average Rasmussen and Gallup and conclude that is the true "state of the race," though even that average is still subject to incredible volatility in the closing five weeks of the campaign.)
There are plenty of data points to encourage Republicans, and these are genuine data points as opposed to the junk food offered up by Quinnipiac and Marist, which derived their predictions from samples that included enormous Democratic voter margins in key states, pro-Democratic turnout margins even greater than those achieved in Obama's blowout year of 2008.
Two data points that warm GOP hearts and undermine the junk polls: (1) Absentee requests in Ohio by Democrats are trailing their 2008 totals, often by a lot in key Democratic counties like Cuyahoga County; and (2) overall voter registration for Democrats in the Buckeye State is down dramatically from 2008.
These two bits of info undermine the credibility of the Obama booster polls, as did the interviews I conducted with key leadership from both polls and with other informed observers.
The data on absentees and registration point to a fall off in enthusiasm for the president from the highs of 2008, a result both of his epic failures as president and of the fact that the second time around isn't nearly as exciting as casting a vote for the first ever African-American president.
The conversations with the experts are the most revealing, however, and the Manhattan-Beltway media elite has really failed to do even minimal homework here, choosing instead to accept the conclusions of the pollsters it pays to provide data, thus outsourcing its work.
In this respect big media resembles nothing so much as investors in Bernie Madoff's funds. Madoff was never asked tough questions by his investors or the SEC, and so he rampaged through his clients' cash.
Big news organizations that turn off their skepticism and write checks to polling firms are doing the same thing and for the same reason: They lack the skills to do their own analyses, and they don't want to be thought stupid for asking obvious questions.
I don't mind admitting I don't know why a sample should include 10% more Democrats than Republicans, so I ask, and ask, and ask. This is what journalists do. Unless, apparently, they have paid for the product or want badly to believe it is true.
But don't believe me. Read the interviews.
I also spent a lot of time with the National Journal's Director of Polling Reporting, Steven Shepard, on September 25, and that transcript is here.
Here are the key takeaways from all these conversations.
The pro-Obama pollsters don't have answers as to why their skewed samples are trustworthy beyond the fact that they think their approach to randomness is a guarantee of fairness, and they seem to resent greatly that the questions are even asked, much as Madoff would have resented questions about his stunning rate of return.
Barone notes that percentage turnout by party in a presidential year hasn't been much greater for the president's party than it was in the preceding off-year, which makes samples outstripping even the 2008 model of Democratic participation "inherently suspicious."
Cost notes that Romney is winning the independent vote in every poll, which also makes big Obama leads suspect.
And my conversation with Mr. Shepard, whose employer National Journal has a reputation for the best non-partisan work inside the Beltway, didn't turn up any academic, disinterested support for the proposition that party identification cannot be weighted because of the inherent instability of the marker.
The biggest unanswered question of all: If party ID is so subject to change that it should not be weighted according to an estimate of turnout, why ask about it at all? And if it is for the purpose of detecting big moves, as Mr. Shepard argued, why not report that "big move" in the stories that depend upon the polling?
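The stakes in the weighting dispute can be made concrete with a little arithmetic. The sketch below uses invented numbers throughout (the party shares and the candidate-support figures are hypothetical, not drawn from any actual survey) to show how much a poll's topline margin depends on the partisan mix of the sample, even when the same voters answer the same way:

```python
# Hypothetical illustration of party-ID weighting in a poll.
# Every number here is invented for the example, not taken from any real survey.

def weighted_margin(shares, support):
    """Topline net margin for a candidate, given each party group's
    share of the sample and that group's net support (as fractions)."""
    return sum(shares[p] * support[p] for p in shares)

# Invented net support for the incumbent within each party group:
support = {"dem": +0.85, "rep": -0.85, "ind": -0.05}

# Sample as interviewed: a D+9 partisan mix.
raw_shares = {"dem": 0.37, "rep": 0.28, "ind": 0.35}

# The same respondents reweighted to a D+3 mix.
adj_shares = {"dem": 0.34, "rep": 0.31, "ind": 0.35}

print(round(weighted_margin(raw_shares, support), 3))  # D+9 topline
print(round(weighted_margin(adj_shares, support), 3))  # D+3 topline
```

With these invented inputs, the D+9 sample yields roughly a six-point lead while the D+3 mix yields roughly a one-point lead. Nothing about the voters changed, only the assumed partisan composition, which is exactly why the make-up of the sample is the first thing to ask about.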
Over the next few weeks, the junk polls will start tweaking their samples and dancing around that fact in order to come closer to reality. Too late. The sample controversy has penetrated into the public's mind, and Quinnipiac and Marist are marked as either rubes or rogues unless the huge Democratic turnout advantage materializes on November 6.
In the meantime, if you see a story on a poll that doesn't tell you the partisan make-up of the sample, note to yourself that the reporter is omitting the bit of data most interesting to most readers.
Just as Bernie Madoff used to hide the real story.