How Reliable Are Surveys?
GUY RAZ, HOST:
And if you're just tuning in, this is WEEKENDS on ALL THINGS CONSIDERED from NPR News. I'm Guy Raz.
Have you ever received an unsolicited phone call from someone asking you questions about your politics or your buying habits or your likes and dislikes? Well, those surveys have long been important tools for corporations and political campaigns. But here's the thing - did you ever refuse to answer those questions or just hang up altogether?
Well, you're not alone. As a matter of fact, you're in the vast majority. Reporter Will Oremus has been covering that story for Slate.
WILL OREMUS: Pew Research did a study of their own studies recently. It was sort of a meta study, if you will. And they found that only nine percent of the people they tried to reach for their public opinion surveys actually respond.
RAZ: Nine percent.
OREMUS: That's right.
RAZ: Is it different? I mean, did people respond to surveys in greater numbers 10 years ago, 20 years ago?
OREMUS: They did. In fact, in much greater numbers. Back in 1997, when Pew did this same methodology survey, they found that 36 percent of the people they were trying to reach actually responded.
RAZ: So why is it so low now? Why just nine percent?
OREMUS: There are a variety of reasons, but one of the main ones is the increasing prevalence of cell phones. People don't like to answer cell phone calls from numbers that they're not familiar with. I don't know if it's the prominence of the caller ID display, or just the fact that we're really sick of intrusions, but we don't answer cell phones like we used to answer land line phones.
And beyond that, when you get a hold of a youngster on a cell phone, they're less likely to have an adult to pass the phone to, and so it becomes more difficult to get the same number of respondents for that reason as well.
RAZ: As your article points out, there are specific kinds of people who tend to be more likely to respond to pollsters, who will sit down on the phone and answer questions for 10 minutes. I mean, I will happily admit that I have never answered a poll. Perhaps I'm not doing my civic duty.
OREMUS: You're part of the problem.
RAZ: Right. But, I mean, you know, I don't have time. I've got two small kids. So who is answering those questions?
OREMUS: You don't know exactly what types of people are and aren't responding. And - so that's what this survey by Pew is trying to get at. They're trying to figure out, OK, who is it out there that is not responding, and why is that and how could that skew our results? Now, some of the things that they found are things that make a lot of sense intuitively.
For instance, the people who do respond to surveys are far more likely to engage in volunteerism than those who don't take the call. So now that they know that, they can look out for that in their future poll results. But my point in this article was that there may be all sorts of other variables that go along with your willingness to answer or not answer a pollster's call that could skew the results of those surveys.
RAZ: Will, as a journalist, I am sure that you have used survey data in your stories. I know I have. Right? And now I'm wondering if I should at all. I mean, are they entirely misleading?
OREMUS: Well, here's the reassuring thing. So polls today may not be quite as reliable as they were 15 years ago when people were willing to answer the phone, but they're probably still more reliable than they were, say, 30 or 40 years ago when the most reputable polls were conducted door to door because there was a concern that not enough people actually had land line phones.
So it's not that polls today are worse than they've ever been before. It's just that they're not quite as reliable as they were during a sort of golden age that was a byproduct of the prevalence of land line telephones in the '90s.
RAZ: That's Will Oremus of Slate. Thank you so much.
OREMUS: Thanks very much. Transcript provided by NPR, Copyright NPR.