KTWU I've Got Issues
IGI 11
Season 11 Episode 11 | 26m 52s | Video has Closed Captions
We discuss the role of polls and election surveys in American politics. Guests are Dr. Ann Selzer, president of Selzer and Company, and Dr. Patrick Miller, associate professor of political science at the University of Kansas.
KTWU I've Got Issues is a local public television program presented by KTWU
- Coming up on IGI, we're honored to have one of the best pollsters in the United States join us right here in the studio to talk about polling and public opinion.
Stay with us.
(calm music) - [Narrator] KNEA, empowering educators, so that educators can empower Kansas students.
- [Announcer] This program is brought to you with support from the Lewis H. Humphreys Charitable Trust, and from the Friends of KTWU.
(calm music) - Hello and welcome to IGI.
Public opinion is fundamental to democracy.
It informs us about elections and impacts the actions and accountability of our elected leaders.
Yet to many Kansans, polling is a giant mystery, and they have no idea how it works.
In a more polarized America, how do we get at public opinion?
Joining us now to discuss the role of polls and election surveys in politics are Dr. J. Ann Selzer, president of Selzer and Company, and Dr. Patrick Miller, professor at the University of Kansas.
Thank you so much for joining us for this program.
Before we get started, we always love having Dr. Miller here.
That's great, thanks for joining us.
- And thanks for having me.
- I did wanna point out that it's a real treat to have Ann Selzer here, and 538.com, which is a well-known polling site, has said that Ann Selzer is the best pollster in politics.
Pretty good.
And they wrote that her old-school rigor is what makes her uncannily accurate.
The New York Times said Dr. Selzer is the world's best pollster.
So now we're going up from America to the world.
And Politico is saying that she's Iowa's most enduring pollster.
So you've got Iowa, you've got America, you've got the world; needless to say, she's very accomplished in her field.
And it's just a real treat to have you here.
- Well thank you, thank you.
May it always be so yes.
(speakers laughing) - Yes and polling that's for sure, right?
I did wanna start, obviously we have two incredible experts here.
I wanted to start because you're a Kansas native, you went to Topeka West High School in Topeka, Kansas.
So how did you get from Kansas and then get interested in polling and then get to where you are now?
- Yeah, well, I have a very linear career track, which is unusual in the polling business. When I went to the University of Kansas, I got interested in collecting data about what people think, and then went to graduate school at the University of Iowa.
That's how I became an Iowan. I spent a year during that time in Washington as a congressional fellow on Capitol Hill and got the political bug.
So the marriage of data collection and politics leads one to polling.
So I have over 35 years of experience in the polling industry.
- Was there a moment in your younger days where, as you say, you got sort of this bug and thought, wow, this is really interesting?
- I don't know that it was one moment in particular, but it was sort of that marriage of, well, I think I'm always gonna be a data girl.
I'm always gonna be interested in having something that tells you yay or nay as to whether your theory is right.
And then being able to do that in a political atmosphere.
I do other kinds of polling work, but this is the one that I'm most noted for and is really contingent on having good methods in place.
And good methods apply beyond polling, beyond politics.
- No, obviously campaigns have pollsters.
What made you decide not to go into elections and campaigns and work for a candidate or a senator or something, and instead go into sort of the private side?
- Well, in some ways it's the private side but in some ways it's actually the public side.
So the work that we do, most notably for the Des Moines Register, but I've polled for Bloomberg Politics and for other newspapers, is made public.
And so the accountability factor of whether your numbers turn out to be right or wrong is very public, plays out in the public.
And that decision was really driven by having the opportunity to join the staff of the Des Moines Register, which at that time was one of the largest circulation newspapers and also had its own internal polling operation.
It was started by George Gallup.
And so since 1943 the paper had been doing that, and it gave me a chance to really learn the business without having to go get clients.
And then when I left, I could go get clients and do it, but that part's really sort of stuck.
- Now, Dr. Miller, you teach polling.
So you went into that side.
How did you get interested in it?
- Oh boy, well, when I arrived at college in the late 90s, I went to college in Virginia, where I'm from, at William and Mary.
I thought I was going to be a psychology and political science double major.
And it hadn't really occurred to me at the time that those two things could actually overlap until I took a course in survey research with Ronald Rapoport, who was my undergraduate advisor, and I was just completely fascinated with that.
And then later, when I went back to graduate school, I picked a grad school, UNC Chapel Hill, that really specialized in what we call political behavior.
I got a parallel graduate certificate in survey research and helped set up a survey research center at Duke, but my interests have always been more in data collection and research, of which polling can be a part. I definitely loved the polling process and gathering data, but I think my passion has always been more about doing that for social science research purposes than for commercial or campaign purposes.
- Now, I teach American politics and I teach about elections, but I'll admit when it comes to survey research and polling, I do try to bring Dr. Miller in 'cause I'm not very adequate. But I do encounter, surprisingly in 2021, people who seem to have attitudes about polling and survey research and yet seem to know nothing about it.
So people will tell me, well, those polls are never right.
And I of course also hear, well, they can't be right because I'm never polled.
And then it is fun to teach.
I don't know if you find this, Dr. Miller, but people's eyes open when you say, oh, just a few thousand people polled, and this is accurate.
So, you know, for viewers out there, hopefully they'll learn a number of things from this show, but I think the number one thing they can learn is that polls can be accurate, and they can be fair and unbiased.
So how do we demystify this idea of polling, Dr. Selzer?
- Well, the fundamental idea of polling, and this is going back to George Gallup, an Iowan, who said: you don't have to eat the entire pot of stew to know what it tastes like.
You have to draw a representative spoonful from a well stirred pot.
So if you think about that as a researcher, what you want is to gather data from a small group of people, that's your spoonful that represents the meaningful universe.
Typically that's people who are going to vote, but the way in which you go about collecting it matters: it's gotta be a random selection, done in a way that everybody who's in that bigger universe has an equal chance to be contacted to participate in the poll.
It doesn't mean that they will be contacted.
It just means that there's an equal probability to do it.
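Gallup's "spoonful from a well-stirred pot" intuition can be illustrated with a quick simulation. This is a hypothetical sketch, with a made-up population, support level, and sample size, not anything from the guests' actual work:

```python
import random

# Hypothetical "universe" of a million voters, 52% of whom support
# candidate A. The true value is known here only because we made it up.
random.seed(7)
population = [1] * 520_000 + [0] * 480_000

# A simple random sample: every voter has an equal chance of selection.
sample = random.sample(population, 1_000)

# The "spoonful" estimates the whole pot within a few points.
estimate = sum(sample) / len(sample)
print(f"true: 52.0%, sample estimate: {estimate:.1%}")
```

Interviewing only 1,000 of the million voters typically lands within a few percentage points of the truth, which is the point Dr. Selzer makes next about sample size and accuracy.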
And so in the olden days, when everybody had a landline, and literally almost everybody did, you had one phone number per household, and if I knew what your phone number was, I knew where you lived.
All of that was sort of the golden heyday of accuracy in polling, and since then, all of the technological innovations that have happened have worked against pollsters.
So we have to double and triple and quadruple our efforts to be sure that we're still inviting and able to get that cross section of people into our polls.
- Dr. Miller, do you find this sort of confusion about polling, where people tell you, oh, I don't do polls because they're biased?
When of course, if they're done right, they can be incredibly accurate.
- Yes, or I run into the person who knows a little bit and then thinks they're an expert. Not to be mean, but I think people can be like that about anything: you know a little, and then you think you know more than you actually do. There is a lot that goes into producing a good poll, as you all know.
I think there's a lot more math than the general public really appreciates.
It is oftentimes not as simple as just talking to a group of people and then you're done.
You have to have an idea of what the population looks like that you're trying to talk to.
Sometimes you have to think about how to mathematically adjust your data.
There are all kinds of trade-offs, as she talked about, with changing technology and changing patterns of phone use, and pollsters are trying different modalities of contacting potential survey respondents, which I think in some ways people in the industry are still figuring out as they go.
So it is complicated. It's a science, there's a bit of art to it, but it's a science, and I think that science aspect is often underappreciated.
- Isn't there also a double standard, so on the one hand people will say, I don't trust those polls.
You know, I don't want anything to do with them.
And then the day after election day: well, why was this poll three points off?
They said that this person was gonna win, and the person might've lost by one point when the poll had him down two.
So there's also these expectations, these incredibly high expectations.
So I guess that's a key thing we should get to right away, which is margin of error, that people need to understand that if you're talking about a poll you've done and you say this, you're not saying this candidate will win.
You are talking about margin of error, right?
- Well, the margin of error is what's related, and it's sort of a reminder that the business we're involved in is a business of estimation.
We're not in the business of exactitude.
So what we give is our best shot at what it is that we think is gonna happen.
And the margin of error, depending on how many people you've interviewed, is typically between 3, 4, or 5 percentage points either way around the number that you present.
So when we say that things are within the margin of error, we still probably have a poll showing one person leading, and that's our best estimate, but the margin of error would allow for things to be slightly different.
- And Dr. Miller, you had a couple of questions regarding margin of error and weighting, and now is not a bad time to ask.
- Oh I think just on the point that was asked, one way I also like to describe that to people I also teach statistics, and polling is a lot of statistics.
So I think another good way to think about the point that she just made is arithmetic tells you two plus two is four.
The statistics that goes into polling tells you with the data that we have, our best estimate is that two plus two is four, but it could be five or three.
And we have a degree of certainty that is in that range.
And I think people are used to zeroing in on that exact number.
They want that exact number instead of thinking about the fudge factor.
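As an illustrative aside, the "fudge factor" has a textbook formula for a simple random sample: the 95% margin of error for a proportion p with n respondents is roughly 1.96 × √(p(1−p)/n). The sample sizes below are made up for illustration and are not from any particular poll:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# p = 0.5 is the worst case; these sample sizes land in the
# "3, 4, 5 percentage points" range mentioned above.
for n in (400, 600, 1000):
    print(f"n={n}: about ±{margin_of_error(0.5, n):.1%}")
```

Note that quadrupling the sample only halves the margin of error, which is one reason polls rarely interview far more than a thousand people.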
So I think the explanation there was really good. But yeah, so weighting.
So I guess- - So, what is weighting? And then we'll get to the details.
- So weighting is, it's like a gym weight, not like I'm waiting for you, right?
Basically, it is how much weight, how much effect, you want to give a person in your survey on your final data.
The people who answer your surveys don't always look like the population that you wanna talk to.
There are always groups of people who are over and underrepresented.
And so not every respondent counts equally.
Some people will count effectively for more than one person, particularly if they come from certain demographics where we have the most under-representation and if you're overrepresented, you might count for less.
So, that's the weighting aspect.
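The adjustment Dr. Miller describes is often done by comparing each group's share of the sample to its share of the population. Here is a minimal sketch with made-up numbers, not any pollster's actual weighting scheme:

```python
# Post-stratification weighting with hypothetical numbers: each group's
# weight is its population share divided by its sample share.

# Hypothetical census-style targets for two age groups.
population_share = {"under_45": 0.45, "45_plus": 0.55}

# A hypothetical sample of 1,000 in which older people answered more often.
sample_counts = {"under_45": 300, "45_plus": 700}
n_total = sum(sample_counts.values())

weights = {
    group: population_share[group] / (count / n_total)
    for group, count in sample_counts.items()
}

# Underrepresented respondents count for more than one person (1.5 here);
# overrepresented respondents count for less (about 0.79).
print(weights)
</antml_code_unused>```

After weighting, the effective totals match the population targets: the 300 younger respondents count as 450 people, and the 700 older respondents count as 550.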
But the question that I had, to get your opinion as someone who is out doing this professionally, is: how much transparency about the weighting process should we have from pollsters?
You know, I think about the pollsters that we have doing polls in Kansas.
And they'll tell us a little bit, like, we used this demographic or that demographic, we've factored in partisanship, which is better than nothing, but it's not really as much as you might wanna know.
So, what's your take on that?
- Well, I'll tell you how we approach it, and then I'll hopefully remember to come back and answer that more directly. Of course there are discrepancies in your data that make it not reflect the population out there, and that's biases in who responds and who doesn't respond, as you said.
So you want to weight to known population parameters, or I want to, anyway.
So even when we're polling ahead of an election, we gather basic demographics on everybody we talk to, whether or not they meet our screen as a likely voter.
And that way we have a general population sample; we can go to the census and weight everybody there.
Now, as you know, voters don't vote in proportions that equal the general population.
So if a group is overrepresented among voters, when we pull out the likely voters, we'll have more of them.
So we'll have more older people for example, and fewer younger people, even though we adjusted the bigger group.
So we like that because it's not looking backward to what happened in a previous election.
There are lots of pollsters who do that.
They have a likely voter model and it's modeled based on past data.
So what they're really doing is predicting a past election.
I wanna be in a position where my data will show me what this future electorate looks like.
So I don't wanna make any assumptions about that. And weighting by partisanship, I mean, I'm delighted that people will disclose that they do that, but there are some beautiful charts out there that show there's no consistency from poll to poll to poll about what the right mix of partisanship is.
And in Iowa, in particular, if you just go by how people are registered, well, they don't vote in those numbers.
So if you have adjusted your sample to look like the secretary of state's master voter list, you will under-represent Republicans because they are disproportionately likely to vote.
So we try to make zero assumptions about what this future electorate looks like, so that our data will reveal when it's different from what we might've expected.
- So, if I could maybe ask you to expand on that a little bit of a different way.
So, maybe for the viewers, it might be interesting to say that if I'm going to do a survey of American adults, the census tells me what American adults look like. So I might know that 13% of the population is African-American, but if only, say, 5% of my respondents are African-American, I know I need to weight Black respondents up. But we don't have that perfect data about voters.
So when you're talking to voters to build this idea of what you think that electorate is going to look like, what does that process look like for you?
What are you asking?
What are you looking for?
- Again, I'm not deciding what this future electorate looks like, and I'm making no guess as to what that's going to be.
So I let the data reveal it to me. For example, in 2016, the data revealed that the African-American vote was going to decline compared to the way it had been in 2008 and 2012, when Barack Obama was running. And, you know, post-hoc it makes all kinds of sense that that vote might decline.
But there were a lot of pollsters out there who were making their 2016 data look like 2012.
And so they were off by a significant factor, because the African-American vote in particular declined, and that's highly correlated with Democratic votes.
- So then when you develop the weights for your surveys, what are you typically weighting?
- We will weight by age and sex and, depending on what state we're working in, sometimes race and geography within the state. But it's all, again, at the general population level. If a part of the state is more likely to vote, when we pull from that larger sample of maybe a thousand people and it ends up being more like 750 likely voters, then that part of the state will be overrepresented among the likely voters that we pull out.
So our method sort of tells us what's happening with the electorate rather than us telling our data, make the numbers look like this.
- Let me jump in here again, going to- - It's getting pretty nerdy, isn't it?
- No, I'm loving it.
(Indistinct) There's gonna be a viewer who is loving the conversation but is still gonna be sitting there saying, well, I don't answer those phone calls. Which we're gonna get to in a second, whether it's a phone call or not, but I get that all the time.
Well, I don't even answer that. And, you know, I tell my wife, "please answer all the phone calls."
'Cause I'm a nerd and I wanna know who's polling, who's out there, whether it's the Democrats, Republicans, or interest groups.
Sometimes I'll Google whoever's paying for the poll.
But beyond that, you know, I also tell people, answer that phone; your opinion will then be part of that poll.
So what would you say to a viewer, and I'm sure there are many right now saying I don't even answer those phone calls from pollsters?
- Well, I encourage them to rethink that, (speakers laughing) number one. But polling is a bit, and in my world a big bit, of democracy.
It's a chance for your voice to be heard in a way that elected officials and politicians running for office listen to.
And so if you're sort of saying, well, I'll keep that to myself, you're not participating in what can be an important piece of democracy.
But I will say that non-response is the biggest problem that we face.
And if a pollster tells you they don't worry about it every night, they're kidding themselves and probably you as well.
- So, you mentioned landlines. Some of the early, famous polls were done by newspapers, and I'm not so sure they did so well, where you clipped things out and wrote on them.
And I still get polls from elected officials through the mail saying, send back this survey, which drives me nuts.
I say this isn't scientific. But it used to be landlines; now how do we reach people to get an accurate poll? Let's start with Dr. Selzer.
- Well, up until 2008, we were able to just focus on landlines and we could ignore everything else.
And then the world changed, because people started dropping their landlines with the recession, wanting to control household costs, and with cell phones, the technology has let us overcome what the issues were in gathering that sample.
So you can still do a landline and cell phone sample and reach pretty much the entire population.
There are other polling companies that will set up a group of people, a panel of people who volunteer to be polled repeatedly.
I can't say I'm a fan of that, even though some of them do very well.
It just, the science behind it doesn't sit as well with me.
- Right, I think sometimes they don't necessarily account for attrition in those panels very well. - And some text messaging.
You were commenting on that.
- So, I mean, it's something I've definitely noticed here in Kansas.
So with our polls for the US Senate last year, about half of the polls that we saw of that race sampled respondents or interviewed them entirely, or partly, by text message. To me, having, you know, learned about polling back when everything was landline, and being very focused on research, I look at this and I say, well, do we really know enough about text-message sampling to know whether it's a reliable way?
But I think it's something that a lot of pollsters are doing to save costs and maybe explore.
I don't know what your take on the text message polling would be, right?
- Well, the problem that I have is whether you can put together a sample of phone numbers that will be text message friendly, that will represent the whole universe.
So you know, just in your ordinary life, most people who have a phone answer it from time to time, but people don't necessarily want a text from an unknown source.
So to me, it sounds like it might be easier and faster and shorter, but are you really getting the proper sample that you need?
- Now, Dr. Selzer, you're very famous for your Iowa Caucus polls.
And I should say that it's impressive to me, because the Iowa Caucus, as many of you know, is quite an interesting animal in American politics. Predicting turnout, or maybe that's what you don't do, but trying to figure out turnout in a caucus is so difficult.
Is there still a role for the Iowa Caucus going forward in American politics?
I'm not asking you to say whether it should or should not happen, but do you see a role for this very unique system in Iowa?
- Well, it's a party organizing activity and then it became famous.
The structure of it was put in place before people like you started to become interested.
And the idea that you can come to a state and in a day meet 12, 13 candidates, it keeps the national press interested in it, but it was designed for a group of neighbors to sit in the living room and make some decisions.
It wasn't designed for 1,500 people in a high school gym or auditorium to be able to do that.
So the decisions about that lie somewhat with New Hampshire and somewhat with the national parties.
And we'll see what happens.
- Dr. Miller, do you see the caucus system surviving in America?
- I mean, my gut response is abolish all caucuses everywhere, purely because I think they're very undemocratic.
The regular voting process of just showing up to the polls is skewed anyway, in terms of who that process advantages in many kinds of ways.
And so to then have a system where you're saying, we're gonna have this method of voting where you might be here for eight or 10 hours and it has to take up your entire day.
And then it gets very complicated, is just compounding all of those problems.
Like how do working families afford to do that and so forth?
- I'm gonna have to move on.
We just got a little bit of time, but my public service message to the viewers is, you know, engage in polls, answer the phone and use your smarts.
And if you think, hey, this is a candidate up to something, you can probably figure it out and get, you know, get educated on polling and make your voice heard.
But each of you has roughly a minute: what is it that you want people to know about public opinion or polling?
Dr. Selzer, we'll start with you.
- Well, just that it's meticulous work, and the people who are doing it mostly take it very, very seriously. But we rely on the kindness of strangers.
We are only able to be successful if somebody agrees to pick up their phone and stay with us.
So that's the tricky part.
So be kind.
- Dr. Miller?
- And I would say, it's interesting, Dr. Selzer talked about accountability.
Most pollsters are not gonna continue to get business if their data are bad; the typical pollster is not trying to mislead you.
And I know that there is a lot of distrust of all kinds of social and political institutions these days, but it really is a business.
It's one of those elements of politics where there really is fallout to doing it poorly.
So I think it can be difficult for people to shut down that mistrust and skepticism.
But I think people, I would hope people would learn to do that more.
- Well, it's been a real treat to have both of you here.
I personally could go on for a couple more hours, but maybe we'll just organize a conference or a seminar and have you be the speaker and Dr. Miller can be the moderator, but thank you so much.
- Thanks for having us.
- My pleasure.
- That's all the time we have for this episode of IGI.
If you have any comments or suggestions for future topics, send us an email at issues@ktwu.org.
If you'd like to view this program again, or any previous episodes of IGI, visit us online at watch.ktwu.org.
Thanks so much for watching.
(calm music) - [Announcer] KNEA, empowering educators, so that educators can empower Kansas students.
- [Announcer] This program is brought to you with support from the Lewis H. Humphreys Charitable Trust and from the Friends of KTWU.
