The data on absentees and registration point to a fall off in enthusiasm for the president from the highs of 2008, a result both of his epic failures as president and of the fact that the second time around isn't nearly as exciting as casting a vote for the first ever African-American president.
The conversations with the experts are the most revealing, however. The Manhattan-Beltway media elite has really failed to do even minimal homework here, choosing instead to accept the conclusions of the people they have paid to give them data -- in effect, outsourcing their work.
In this respect big media resembles nothing so much as investors in Bernie Madoff's funds. Madoff never got asked tough questions by his investors or the SEC, and thus he rampaged through his clients' cash.
Big news organizations that turn off their skepticism and write checks to polling firms are doing the same thing and for the same reason: They lack the skills to do their own analyses, and they don't want to be thought stupid for asking obvious questions.
I don't mind admitting I don't know why a sample should include 10% more Democrats than Republicans, so I ask -- and ask and ask. This is what journalists do. Unless, apparently, they have paid for the product or want badly to believe it is true.
But don't believe me. Read the interviews.
I also spent a lot of time with the National Journal's Director of Polling Reporting, Steven Shepard, on September 25, and that transcript is here.
Here are the key takeaways from all these conversations.
The pro-Obama pollsters don't have answers as to why their skewed samples are trustworthy beyond the fact that they think their approach to randomness is a guarantee of fairness, and they seem to resent greatly that the questions are even asked -- much as Madoff would have resented questions about his stunning rate of return.
Barone notes that percentage turnout by party in a presidential year hasn't been much greater for the president's party than it was in the preceding off-year, which makes samples outstripping even the 2008 model of Democratic participation "inherently suspicious."
Cost notes that Romney is winning the independent vote in every poll, which also makes big Obama leads suspect.
And my conversation with Mr. Shepard, whose employer, National Journal, has a reputation for the best non-partisan work inside the Beltway, didn't find any academic, disinterested support for the proposition that party identification cannot be weighted because of the inherent instability of the marker.
The biggest unanswered question of all: If party ID is so subject to change that it should not be weighted according to an estimate of turnout, why ask about it at all? And if it is for the purpose of detecting big moves, as Mr. Shepard argued, why not report that "big move" in the stories that depend upon the polling?
Over the next few weeks, the junk polls will start tweaking their samples and dancing around that fact in order to come closer to reality. Too late. The sample controversy has penetrated into the public's mind, and Quinnipiac and Marist are marked as either rubes or rogues unless the huge Democratic turnout advantage materializes on November 6.
In the meantime, if you see a story on a poll that doesn't tell you the partisan make-up of the sample, note to yourself that the reporter is missing the most interesting bit of data to most readers.
Just as Bernie Madoff used to hide the real story.