
How IBD Accurately Gauged Voter Enthusiasm and Got the Polls Right

Investor's Business Daily was one of (very) few pollsters to predict that Donald Trump would be elected our 45th president, despite most polls giving Hillary Clinton the edge. So, how did they do it? Terry Jones, the commentary editor for IBD, spoke with Townhall about the specific methods that set the firm apart from other polling companies.


“We focus on the data,” Jones said of the IBD/TIPP poll. “The idea is not to get it out quick and dirty but to get it out quick and correct. We worked very hard at that.”

IBD’s pollster, TechnoMetrica President Raghavan Mayur, created the spot-on model, which included these key ingredients.

The Key Sample - Likely Voters

Like most pollsters, IBD starts with a random sample of the public and bases it on census districts.

“The idea is to get as broad and as representative a sample of people as possible,” Jones said.

Then they adjust for what the Census Bureau says, accounting for what America looks like in terms of age, gender, religion, and so on.

“We are reasonably sure, when we have a population of people we’re talking to, that we can make them look like America,” he explained. “You can take 1,000 people and feel fairly confident that the opinions they express, after you’re done performing calculations, will look like America and reflect American opinion.”
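The adjustment Jones describes is essentially post-stratification weighting. Below is a minimal Python sketch of that idea; the census shares and sample counts are hypothetical illustrations, not IBD's actual figures.

```python
# Minimal post-stratification sketch: weight each demographic group so
# the weighted sample matches (hypothetical) census population shares.

# Hypothetical population shares by age group, e.g. from census data.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical respondent counts in a 1,000-person sample.
sample_count = {"18-34": 220, "35-54": 360, "55+": 420}
total = sum(sample_count.values())

# weight = population share / sample share; groups under-represented in
# the raw sample get weights above 1, over-represented groups below 1.
weights = {
    group: population_share[group] / (sample_count[group] / total)
    for group in sample_count
}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

After weighting, the sample "looks like America" in the narrow sense that each group's weighted share equals its population share.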

The key to accuracy is to survey not just registered voters, but likely voters.

“If someone’s not going to vote, there’s no reason to include them in your poll. We identify our likely voters by asking them questions and using data to identify them. There are certain groups of people who are far more likely to vote than others. We winnow out the people who are not likely to vote. Our likely-voter method has worked very well.”
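A likely-voter screen of the kind Jones describes can be sketched as a simple scoring filter. The questions and cutoff below are hypothetical stand-ins; IBD's actual model is not public.

```python
# Sketch of a likely-voter screen: score respondents on turnout-related
# questions, then winnow out those below a cutoff. All inputs hypothetical.

def turnout_score(respondent):
    """Score turnout likelihood from self-reported answers (0-3)."""
    score = 0
    if respondent["voted_last_election"]:
        score += 1
    if respondent["knows_polling_place"]:
        score += 1
    if respondent["intent_to_vote"] >= 8:  # 0-10 self-rating
        score += 1
    return score

respondents = [
    {"id": 1, "voted_last_election": True, "knows_polling_place": True, "intent_to_vote": 10},
    {"id": 2, "voted_last_election": False, "knows_polling_place": False, "intent_to_vote": 3},
    {"id": 3, "voted_last_election": True, "knows_polling_place": False, "intent_to_vote": 9},
]

# Keep only respondents likely to vote (score of 2 or more).
likely_voters = [r for r in respondents if turnout_score(r) >= 2]
print([r["id"] for r in likely_voters])  # ids 1 and 3 pass the screen
```

Polling the `likely_voters` subset rather than all registrants is what distinguishes a likely-voter poll from a registered-voter poll.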

Accurate Ratios

Jones said IBD has a model that comes up with ratios, such as 36 percent Democratic registration, 29 to 30 percent Republican, and 34 percent independent or "other." If a random sample doesn’t come close to those numbers, they adjust the responses to reflect that ratio. Otherwise, he said, "you’re not reflecting the party breakdown."


"Getting that right is crucial," he said. "We use past party registration, but also current registration information. That we can find in various sources. Once that’s done, ask them the questions and then as the poll goes on, we have to make certain decisions."
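The party-ratio adjustment matters because it can move the headline number. The sketch below uses the 36/30/34 breakdown Jones cites; the raw sample counts and candidate-support rates are hypothetical.

```python
# Sketch: reweight a raw sample to a target party-registration breakdown
# (36% Democratic, 30% Republican, 34% independent/other, per Jones).

target_share = {"D": 0.36, "R": 0.30, "I": 0.34}

# Hypothetical raw sample: group size and share supporting Clinton.
raw = {
    "D": {"n": 420, "clinton": 0.88},  # Democrats over-represented here
    "R": {"n": 260, "clinton": 0.08},
    "I": {"n": 320, "clinton": 0.45},
}
total = sum(g["n"] for g in raw.values())

# Headline support, before and after forcing the target party mix.
unadjusted = sum(g["n"] * g["clinton"] for g in raw.values()) / total
adjusted = sum(target_share[p] * raw[p]["clinton"] for p in raw)

print(f"unadjusted: {unadjusted:.1%}")
print(f"adjusted:   {adjusted:.1%}")
```

With Democrats over-sampled, the unadjusted figure overstates Clinton's support by roughly four points in this toy example, which is exactly the kind of error the ratio adjustment is meant to remove.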

Gauging Voter Enthusiasm

After the groundwork is done and they find their sample, the IBD pollsters take a close look at the respondents' answers. In particular, they are looking for voter excitement.

“We have to decide whether one side’s voters are more enthused than the other side’s. In some cases, you see there’s a very clear indication in the data that one side is really excited about their candidate and the other side isn’t.”

This method proved accurate in the 2004 presidential election.

"There were a lot of people in the final week who predicted Kerry was going to win," Jones recalled. "But we saw that the data were far more favorable to George W. Bush just based on the enthusiasm of his voters. Sure enough, it turned out Republican turnout was huge. Even though Republicans accounted for only 30 percent of registrations, they made up 37 percent of the vote."
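Those 2004 numbers show why turnout composition matters. In the sketch below, the Republican shares (30 percent of registrations, 37 percent of turnout) come from Jones; the Democratic and independent shares and the candidate-support rates are hypothetical fill-ins for illustration.

```python
# Sketch: the same electorate, projected two ways. Weighting by party
# registration vs. by actual turnout composition shifts the result.

# Hypothetical share of each group supporting Bush.
support_bush = {"R": 0.93, "D": 0.06, "I": 0.48}

# Republican figures per Jones (30% of registrations, 37% of turnout);
# the other groups' shares are hypothetical and sum to 100%.
registration = {"R": 0.30, "D": 0.37, "I": 0.33}
actual_turnout = {"R": 0.37, "D": 0.35, "I": 0.28}

def projected(composition):
    """Overall Bush support under a given electorate composition."""
    return sum(composition[g] * support_bush[g] for g in composition)

print(f"by registration: {projected(registration):.1%}")
print(f"by turnout:      {projected(actual_turnout):.1%}")
```

A seven-point swing in Republican turnout share moves the projected result by several points, which is why a poll that reads enthusiasm correctly can beat one that weights only by registration.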

This year, he said, the trends were similar. A week and a half before Election Day, he noted, there were signs of very strong enthusiasm among Trump voters.

“They were not getting dispirited and in fact they were in greater and greater numbers coming out of the closet,” Jones said. “We saw strengthening of his numbers across the board.”


No doubt a lot of that had to do with Hillary Clinton’s problems over emails, but more of it, Jones noted, had to do with the solidifying of Trump's growing support in the industrial Midwest.

“Trump crafted a message to reach out to them and we saw it in the numbers,” Jones noted.

Why didn't most of the polls pick up on this engagement among Trump voters? Jones suggested a few reasons why IBD’s fellow pollsters fared so poorly.

Back Testing

The IBD poll, which has been in place for four election cycles now, is really a "refined model," he said. As soon as the results are in, they begin testing the data to find flaws and studying the exit polling.

They also change with the times. For example, in selecting respondents, they used to rely mostly on landlines; they now reach 65 percent of respondents via cell phone.

Changes like those are "advantageous," he says, because they make the model more accurate.

A PR Crisis

Jones emphasized that there’s a bit of a "public relations crisis" in the polling industry. Inaccurate polls have a direct impact on an election because they send a message that a candidate, in this case Clinton, is so far ahead that her supporters don’t need to vote.

"When you’re extrapolating from a 1,000-person sample, it’s very important to get the little things right," Jones said. "If you don’t get people that are really likely to vote, you insert some error in there. They’re not quite as persnickety as they might be with those kinds of things."


In some pollsters' cases, he said, it’s not really their fault.

“One of the things that’s happened to the polling industry is a lot of it used to be driven by newspapers and media spending on polls,” he explained. “It was a prestige project. Since it’s undergone a two-decade long recession, that’s gone. Budget for polling is slashed. A lot of pollsters are operating on really thin resources.”

These factors result in poor-quality polling.

Other polls, he said, are just plain biased. There are several he looked at throughout the 2016 election that he suspected might not be on the “up and up.”

A few, he noted, had Clinton up 10 percentage points with a week and a half to go. Even Obama, one of the most popular presidential candidates in recent memory, who led in both of his elections, never built a double-digit lead over his opponents. This disparity "should have raised red flags," he said.

As for polls showing Clinton’s lead cut in half in the week after the FBI revelation, he suggested that was nearly impossible: "The dispersion was way too great."

That’s why Jones "takes pride" in the integrity of IBD’s pollster, who lets the numbers tell the story, regardless of political party.

They had Hillary Clinton up for a good while and got angry letters, he explained; the reverse happened when Trump got his bump.

For over 50 years, he said, IBD has been built entirely on data and data analysis. It's a "realistic appraisal of what people believe." 


"It’s not a forecast, it’s not an estimate," he added. "We don’t pretend it is one. Today’s poll reflects current public opinion. It’s incumbent upon you to make sure you have the right number for each day."

Mayur called him last week as things were getting "frantic," Jones said, because all the other pollsters had Clinton up substantially. Mayur asked how he was doing, and Jones said he was "tired of getting beaten up." Mayur, for his part, was perfectly content.

"He said, ‘I’m perfectly fine. I’m 100 percent confident in the numbers. I’ve done everything I have to do. My model is in place. We’ve collected the data. I trust the numbers.'"

He had a reason to.
