A better way to have predicted the GOP wave

Interviewing Peterson Institute for International Economics senior fellow Justin Wolfers about October’s unemployment report on Friday, Paul Solman also took the opportunity to ask the economist about last week’s midterm elections. For starters, why did so many voters tell exit pollsters the economy is getting worse when economic data, including October’s jobs numbers, continue to suggest a strong recovery?

Justin Wolfers: To be honest, I’m puzzled. Unemployment has come down from double digits to 5.8 percent, employment growth has been extremely strong, there’s no inflation anywhere on the horizon. The budget deficit is back below its 40-year average.

All these numbers seem to suggest the economy’s doing well, and in fact, if you ask people and look at consumer confidence numbers, they also suggest that people think the economy’s doing fine. But for some reason they’re telling exit polls something altogether different. And not just the exit polls: when it came time to vote, they voted that way, too.

As Paul pointed out on Friday’s broadcast, stagnant wages and part-time employment — with even more part-timers than the Bureau of Labor Statistics reveals in its release each month — could have something to do with it.

Aside from exit polling, we asked Wolfers to tell us about how two other forms of electoral predictors — prediction markets and expectation polling — fared last Tuesday.

In 2012, Solman reported on the presidential prediction markets, which showed President Obama to be the strong favorite even as traditional polls showed a much closer race with Mitt Romney.

Watch that report below:

As it turns out, the question that prediction markets are based on — who do you think will win? — can be adapted for traditional polling, too. Wolfers and co-author David Rothschild (who appears in the segment above) demonstrated in a 2012 Brookings Institution paper, published just before the election, that asking voters about their expectations (“Who do you think will win?”) rather than their intentions (“Whom do you plan to vote for?”) would have more accurately predicted the results of presidential elections over the previous 60 years.

The response rate to polls has been declining over the years (Pew showed 2012’s response rate to be just 9 percent), but one of the advantages of expectation polling is that asking voters who they think will win effectively broadens the sample size. “Voters respond as if they had polled twenty of their friends,” the authors wrote. (Read more about their study, and the political context in 2012, from David Leonhardt in the New York Times.)
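The “twenty friends” intuition can be sketched with a short simulation. Everything in it is an illustrative assumption rather than the paper’s actual model: a hypothetical race where one candidate’s true support is 52 percent, a poll of 200 respondents, and each expectation respondent reporting the majority view among 21 acquaintances (an odd number, close to the “twenty friends” figure, chosen simply to avoid ties).

```python
import random

random.seed(42)

TRUE_SUPPORT = 0.52   # assumed: candidate A's true vote share
POLL_SIZE = 200       # assumed: respondents per poll
FRIENDS = 21          # assumed: each respondent "polls" ~20 acquaintances
TRIALS = 2000         # simulated polls per method

def intention_poll():
    """Each respondent reports their own vote; the poll calls the leader."""
    votes_for_a = sum(random.random() < TRUE_SUPPORT for _ in range(POLL_SIZE))
    return votes_for_a > POLL_SIZE / 2

def expectation_poll():
    """Each respondent reports who leads among their acquaintances."""
    expect_a_wins = 0
    for _ in range(POLL_SIZE):
        friends_for_a = sum(random.random() < TRUE_SUPPORT for _ in range(FRIENDS))
        expect_a_wins += friends_for_a > FRIENDS / 2
    return expect_a_wins > POLL_SIZE / 2

# Fraction of simulated polls that correctly call the winner (candidate A).
intent_hits = sum(intention_poll() for _ in range(TRIALS)) / TRIALS
expect_hits = sum(expectation_poll() for _ in range(TRIALS)) / TRIALS
print(f"intention poll calls winner correctly: {intent_hits:.1%}")
print(f"expectation poll calls winner correctly: {expect_hits:.1%}")
```

Under these assumptions, the expectation poll calls the winner far more reliably, because each respondent’s answer already averages over many voters, which is the sense in which the question “effectively broadens the sample size.”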

So what about midterm elections? The results of Wolfers and Rothschild’s study held up in 2014, too. As Wolfers wrote in his Upshot column last week, expectation polling conducted by The New York Times/CBS News/YouGov accurately predicted the winner in every Senate race except North Carolina’s. That’s a much more successful track record than most surveys of voters’ intentions. Likewise, in gubernatorial races, polls of voters’ expectations missed fewer races than polls of voters’ intentions.

Below, Wolfers tells Solman why prediction markets and expectation polling are the winners of 2014.

Paul Solman: You and I have talked about this for a long time — that we both believe in prediction markets. How did the prediction markets do in this last election?

Justin Wolfers: So pretty much everyone said that the Republicans would take the Senate. So the question is, who predicted it first and who predicted it most confidently? And it turns out that prediction markets moved very strongly toward Republicans a couple weeks ago, and more strongly than most of the pollsters were suggesting. So I’m going to call this one another win for prediction markets.

Paul Solman: Any insights about the deficiencies of polling?

Justin Wolfers: Yeah, so there’s actually two questions that pollsters could ask. Most pollsters go out and say, “Whom do you intend to vote for?” That’s what we’re used to. They could also ask, “Who do you think will win?” It might be a better idea because when I ask you who do you think will win, you think not just about yourself, but you’re going to tell me how your wife is going to vote, the folks you meet around the water cooler, the yard signs you’re seeing in your local area.

It turns out that if you look at that question — who do you think will win — it got every single Senate race right except North Carolina. By contrast, FiveThirtyEight got three of them wrong.

Paul Solman: That’s Nate Silver’s site aggregating all the polls.

Justin Wolfers: Absolutely, he’s as good as polling can get. But simply asking a different question did even better.

When it comes to the governors’ races, Nate got five of them wrong, but if he simply looked at my question — who do you think will win — you’d have gotten only three wrong. So it was a very good election for this alternative approach to forecasting where you just ask who do you think will win.
