
Nate Silver on why the polls don’t always add up

With the midterm elections just over two weeks away, we political junkies can't help but check the latest polls every morning, try though we might to resist the urge.

Interpreting polls can be tricky, sometimes impossible. What do you make of one poll where a candidate is down by six points, and another that has that same candidate down by more than 30? How do you weigh the differences between likely voters and registered voters? Here’s the simplest answer to these and almost all other poll questions: You ask Nate Silver.

Nate Silver is the editor of the much-read, highly respected political blog FiveThirtyEight, which you can now find at The New York Times website. He is a man who knows his numbers, starting with baseball stats: he correctly predicted that the White Sox would lose exactly 90 games back in 2007. He then moved on to politics and came within one point of predicting Barack Obama's popular vote victory. He joined Need to Know to talk about polls, and why the numbers don't always add up.


  • Anonymous

    Well, first of all, I remain unconvinced that a poll of 1,000 to, say, 1,500 people can say anything about national political opinion in the USA. Second, how on earth one can truly differentiate between likely voters and registered voters (by phone) with any validity within the margins of error they quote defies my admittedly limited knowledge of statistics, and yet they give the numbers in each category.
    Then you actually look at the PDF files of questions and percentages, and it's clear in certain cases that the people paying the pollsters are either fundamentally Democratic or Republican! If you try (you do not have to be a genius), you can see it in how the questions are framed.
    I have even looked at PDFs of polls supposedly on Obama's national performance levels, based on two states' results and 2,100 people questioned overall. Sorry guys, statistical error levels of 3-4% are just not real. I have conducted marketing surveys for products, and by mere inflections of grammar in the question I can obtain widely varying results for the same product in the same state, and within the parameters of the brief can claim it's accurate to within 3-4%, subject to further face-to-face research to refine strategies.
    Lies, damned lies, and statistics, I am afraid… it depends on to whom, what, why, and how the questions are asked. Even the order in which questions are asked can change the result.
    Guest 5.
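For context on the 3-4% figures this commenter doubts: they come from the textbook margin of error for a simple random sample, MOE = z * sqrt(p(1-p)/n). A minimal sketch (assuming the worst case p = 0.5 and a 95% confidence level, and deliberately ignoring the question-wording and design effects the commenter raises, which this formula does not capture):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1000, 1500, 2100):
    print(f"n={n}: +/-{100 * margin_of_error(n):.1f} points")
```

This gives roughly ±3.1 points at n = 1,000 and ±2.1 at n = 2,100, which is where the quoted 3-4% comes from; it measures sampling error only, not bias from how questions are framed.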

  • Janet_gulbransen

    Polls can be so misleading. I have one voter out here, a friend of mine, whose first vote in her 43 years was for Obama, and no one is calling her. And she is voting in the midterm election, for the Democrat. Did you count her?

  • Anonymous

    1) Note how the questions are worded; different words appeal to different groups.
    2) What is the purpose of the poll?
    3) Who is behind the authorship of the poll?

    One thing years of probability and statistics taught me: a poll is only as good as the one who authors it.

  • Worker 13

    I would like to see the result of a poll which asked: Would you like to see us provide the opportunity for all Americans to buy into Medicare?
    Also: Do you honestly understand the “war on terror”?
    Also: Do you believe the current U.S. policies are encouraging or discouraging terrorist attacks in our country or elsewhere?
    Also : Would you support reforming our trade policy to keep manufacturing jobs in the U.S.?

  • Worker 13

    It would be interesting to see the results of these four questions asked in a rotating order. None of them is the kind of question you are likely to see on ANY poll.

  • SleepsWithCats

    Re the question of determining who is a “likely voter”, check out

  • David Jerrard Givens

    Aggregation of polls has proven more effective; Taegan Goddard's Political Wire has been excellent with the practice in recent years. It is true that many polls are biased, but with aggregation you again enter a bell curve of possibilities (some polls leaning left, some leaning right), and the numbers begin to correlate better. It is also a larger population sample.
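The averaging this commenter describes can be sketched as below. The pollster names and numbers are invented for illustration, and real aggregators additionally weight polls by sample size, recency, and house effects rather than taking a plain mean:

```python
# Hypothetical polls for one race (names and numbers are invented).
polls = [
    {"house": "Pollster A", "dem": 48.0, "rep": 46.0},
    {"house": "Pollster B", "dem": 44.0, "rep": 49.0},
    {"house": "Pollster C", "dem": 47.0, "rep": 47.0},
]

def aggregate(polls):
    """Unweighted average of each candidate's share across all polls."""
    n = len(polls)
    dem = sum(p["dem"] for p in polls) / n
    rep = sum(p["rep"] for p in polls) / n
    return dem, rep

dem, rep = aggregate(polls)
print(f"Average: Dem {dem:.1f}, Rep {rep:.1f}")
```

Individual house leans of a point or two in either direction tend to partially cancel in the average, which is the bell-curve intuition in the comment above.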