Pitfalls of Opinion Polls: A Case Study

by Jay Johansen

America these days often seems driven by opinion polls. What politician today dares make a move without first consulting the polls? What interest group does not proudly trumpet the results of any poll indicating the public agrees with them?

But polls can sometimes be very difficult to interpret, if not downright misleading. A press release issued in January, 2001 by the Gallup organization about their polls on abortion makes an interesting case study. For if we take people's responses to this poll literally, they display some curious contradictions. Either side in the debate could select questions to indicate that the public supports them.

Note: This article started out as an article about public opinion on abortion, but as I studied the results of this set of polls, the article quickly turned into a discussion of what can go wrong with a poll. So we're pioneering multi-purpose articles: you can read this as an article about pitfalls of polling, or you can read this as an article about public opinion on abortion.

Where do you stand on the issue itself?

The questions which told us the most about what Americans think about abortion itself were these: "Do you think abortions should be legal under any circumstances, legal only under certain circumstances, or illegal in all circumstances?" Those who answered "only under certain circumstances" were then asked a follow-up question: "Do you think abortion should be legal in most circumstances or only in a few circumstances?" Combining the results of these two questions gives the following:

Do you think abortions should be legal under any circumstances, legal only under certain circumstances, or illegal in all circumstances? / Do you think abortion should be legal in most circumstances or only in a few circumstances?

    Legal under any circumstances        28%
    Legal in most circumstances          11%
    Legal only in a few circumstances    38%
    Illegal in all circumstances         19%
    No opinion                            4%

Note: In the graphs, I have consistently shown the pro-life position in green, the pro-choice position in red, and in-between positions in yellow.

These results would seem to be encouraging to the pro-life side and disappointing to the pro-choice side. Only 28% support the present law -- abortion legal at any time during pregnancy for any reason. 19% believe abortion should never be legal. If these people meant this literally, that would mean that they are more anti-abortion than Right to Life, Christian Coalition, or Pregnant Pause! All the major pro-life organizations I know of concede that abortion should be legal when it is necessary to save the life of the mother, but according to this poll almost 1 in 5 Americans would not permit even that exception.

But now compare this to the answers to some of the other questions.

How do you label yourself?

Another question:

With respect to the abortion issue, would you consider yourself to be pro-choice or pro-life?

    Pro-choice                    47%
    Pro-life                      45%
    Mixed/Neither                  3%
    Don't know what terms mean     2%
    No opinion                     3%

This would seem to indicate that the country is very nearly split down the middle. Neither side would be particularly excited about this result.

Compare these responses to the "circumstances" question. Even if we add together all the people who said they believed abortion should be legal in "most circumstances" with those who said it should be legal under any circumstances, that still totals only 39%. That leaves a significant number of people who believe there should be serious restrictions on abortion, but who described themselves as "pro-choice".

How can we explain this? One could offer a number of theories. It is possible that people simply didn't understand the questions or some such, but let's work on the assumption that the people questioned weren't simply stupid or irrational. The best theory I can come up with is that many people who call themselves "pro-choice" understand that term very differently than either Planned Parenthood or Right to Life define it. Maybe they think "pro-life" means a belief that abortion should never be legal and anything else is "pro-choice".

A side note: Pro-choice people frequently try to portray abortion as a men vs women issue. You often hear statements to the effect that a politician who takes a pro-life stand is alienating potential women supporters and such. But breaking down responses to the above question by gender gives:

              Pro-life    Pro-choice
    Men         46%          47%
    Women       45%          47%

That is, the difference between the positions of men and positions of women on this issue was on a level that gets lost in the rounding errors.

Do you think the law should be changed?

Another question:

Would you like to see abortion laws in this country made more strict, less strict, or remain as they are?

    More strict           34%
    Less strict           17%
    Remain as they are    46%
    Other                  1%
    No opinion             2%

This question is surely very encouraging to the pro-choice side. The present laws are very close to what they see as the ideal, and only 34% want to change them in a pro-life direction.

Yet these results are highly inconsistent with the first two questions. Present U.S. law says that abortion is legal at any time for any reason. The only current restrictions are that tax money cannot be used to pay for abortions in most cases, and some states do not allow an unmarried, under-age girl to get an abortion without her parents being notified. According to the first question we discussed above, 68% of Americans believe there should be at least some restrictions on abortion. Yet according to this question, only 34% say they want the law to be more strict than it is now. How can 68% say they think the law should say something very different from what it says now, but only 34% want the law to be changed?

My theory is that many Americans don't know what current U.S. law is. They think that there are more restrictions on abortion than there really are. Thus many will say that they basically support present law because they think it says something closer to what they want than it really does.

This is just a guess based on the available information. Feel free to offer alternative speculation yourself.

One specific

This poll asked one very specific question. That was about partial-birth abortion.

Would you vote ... for or against a law which would make it illegal to perform a specific abortion procedure conducted in the last six months of pregnancy known as a "partial birth abortion," except in cases necessary to save the life of the mother?

    For           63%
    Against       35%
    No opinion     2%

All the leading pro-choice organizations have come out strongly to defend partial-birth abortion. Yet comparing the numbers in this survey, at least 10% of respondents must have said both that they call themselves "pro-choice" and that partial-birth abortion should be illegal: 63% favored the ban, but only 53% of respondents declined the "pro-choice" label, so at least 10% of all respondents must fall into both groups. How can you associate yourself with a movement while saying you disagree with an important, highly vocal position taken by every leader in that movement? Very puzzling.
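The minimum possible overlap between two response groups follows from simple inclusion-exclusion. A small sketch of the arithmetic, using the ban-support and self-label figures from this poll:

```python
def min_overlap(pct_a, pct_b):
    """Smallest possible share of respondents who belong to BOTH groups,
    given the share in each group (inclusion-exclusion: a + b - 100,
    floored at zero when the groups need not overlap at all)."""
    return max(0, pct_a + pct_b - 100)

# 63% would vote for the partial-birth abortion ban;
# 47% call themselves "pro-choice" (figures from the poll above).
print(min_overlap(63, 47))  # → 10
```

Even in the most favorable arrangement, where every respondent who rejects the "pro-choice" label supports the ban, a tenth of all respondents must hold both positions at once.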

Should the Constitution be changed?

Gallup issued this press release with the headline, "Majority of Americans Say Roe v Wade Decision Should Stand". This was indeed an accurate reflection of the response to one question, but taking their surveys as a whole, it was very misleading.

Yes, when they asked about Roe the response was:

Would you vote ... for or against a constitutional amendment that would overturn the Roe vs. Wade decision, and make abortion illegal in all states?

    For           30%
    Against       67%
    No opinion     3%

This response is pretty consistent with the "more strict / less strict" answers, which makes it equally inconsistent with the "circumstances" answers. 34% of Americans say there should be more restrictions on abortions, and 30% of Americans think Roe v Wade should be overturned. Taken by themselves, that sounds rather plausible. But compared to the "circumstances" question, it makes no sense, for the same reasons that the "more strict / less strict" answers made no sense.

One possible explanation in this case is to observe that the question did not ask simply if Roe v Wade should be overturned, but added the detail that this would make abortion completely illegal. As this poll itself showed, most Americans believe there should be more restrictions on abortion than there are now, but don't think it should be completely illegal. Thus, someone who does not think abortion should be completely illegal might answer "no" to this question, even if they think Roe should be reversed. Furthermore, the question specifies that this would be done by a Constitutional amendment. Some people oppose abortion but believe that a Constitutional amendment is the wrong way to go about ending it: they are cautious about tinkering with the Constitution.

Or perhaps this question suffers from the same problem as the "strictness" question: Most Americans do not know what the Roe v Wade decision really said. They don't know that it makes abortion legal at any time during pregnancy for any reason. Thus, even though they disagree with what it really said, they don't call for it to be overturned because they don't know that that's what it said.

Pitfalls of Polling

Thus, this poll leaves us with some baffling inconsistencies. When asked about circumstances, somewhere between 57% and 68% of Americans give a pro-life answer, while only 28% to 39% give a pro-choice answer. When asked how they label themselves, it's 50/50. When asked if they would change the law or overturn Roe v Wade, only 30% or so give a pro-life answer and over 60% give a pro-choice answer. So where do Americans really stand?

Let's take a look at what might have caused these inconsistencies. This is not intended to be a complete list of everything that could be of concern in a poll. Rather, I am using this particular poll as a case study, and just looking at the issues that it brings up.

Sampling Error

When polls are quoted in the media, they often include a statement that the poll has a margin of error of plus or minus 3% (or some other number). A thoughtful person might well ask: what does this mean? It sounds like it means that the most the poll could possibly misrepresent the true opinions of the public is three percentage points. But common sense tells us this can't possibly be true. If you ask a few hundred people their opinions, it is surely possible that you just happened to find the only people in the country who feel that way. Or maybe many people misunderstood the question. Maybe the poll was biased. How could you measure these things at all, never mind confidently say that the biggest possible error is 3 or 4%?

Let's begin with the first objection. You often hear people dismiss polls with a statement like, "Yeah, maybe that's what a bunch of people said, but the next few hundred people might have said just the opposite." This is called "sampling error". It is not a fair criticism of most polls today.

True, amateurs who take their own polls often make serious mistakes in sampling -- either out of ignorance, lack of resources ... or sometimes a deliberate desire to bias the results.

A very small poll is unreliable. To take the extreme, if you asked three people who they planned to vote for in an upcoming election, and two said Senator Smith and one said Governor Jones, it would not be reasonable to conclude that Smith will get 67% of the vote. You have questioned far too few people to get a reliable answer. But this problem is easily solved by questioning enough people. Most professional polls today question about 1000 people. It can be proven mathematically that if you question a few hundred to a thousand or so people, the chance that you just happened to pick the only people in the country who feel a certain way is in fact very small. The "margin of error" quoted in most polls actually means that there is a 95% chance that questioning more people wouldn't change the results by more than the stated percentage. That is, a margin of error of, say, 4%, means that there is a 95% chance that questioning more people wouldn't change the results by more than 4%.1
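The "95% chance" claim can be checked empirically. The sketch below (my own quick simulation, not anything from Gallup) runs many polls of 1000 people drawn from a population whose true level of support is known, and counts how often the poll result lands within 1/sqrt(n) of the truth:

```python
import random

random.seed(0)  # make the simulation repeatable

def run_poll(true_share, n):
    """Simulate asking n randomly chosen people a yes/no question;
    return the fraction who say yes."""
    return sum(random.random() < true_share for _ in range(n)) / n

n = 1000                 # people questioned per poll
margin = 1 / n ** 0.5    # the ~3% margin of error for n = 1000
true_share = 0.50        # assumed true level of support in the population

trials = 2000
within = sum(abs(run_poll(true_share, n) - true_share) <= margin
             for _ in range(trials))
print(within / trials)   # close to 0.95
```

The fraction of simulated polls landing inside the stated margin comes out near 95%, which is exactly what the quoted margin of error promises -- provided, as the next paragraph stresses, that the sample really is random.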

This assumes that you really pick people at random. I once heard a host on Pacifica Radio, a left-wing radio network, say that the vast majority of Americans supported certain policies -- a conclusion she based on the opinions expressed by people who called in to her program. But of course, people who listen to Pacifica Radio are no more representative of the general public than people who listen to Rush Limbaugh. I wouldn't rely on callers to either as a fair sample of the American public.

Some sources of bias can be very subtle. For example, most polls today are conducted by telephone. Suppose you take a poll asking people how much they travel. People who travel more often may be more likely to be in places where their cell phones don't work, so a telephone poll on this question would be very tricky. And how do you know whether people who don't own telephones, or are frequently away from their phones, have different opinions than the rest of the population? Are people who spend more time away from home just as likely to be Republicans as Democrats? The answer to that question is not at all obvious; one could engage in all sorts of speculation.

That said, professional pollsters continually struggle with how to make sure that the people they question really are typical of the American public as a whole. This is a problem that is routinely and seriously addressed, and it is unlikely that it is a major source of error in a professional poll.

To their credit, Gallup's press release included a much more explanatory statement than the simple "margin of error is x". They said, "For results based on this sample, one can say with 95 percent confidence that the maximum error attributable to sampling and other random effects is plus or minus 3 percentage points. In addition to sampling error, question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of public opinion polls."

Yes, it's possible that they questioned 1000 people and they somehow just happened to get the only 1000 people in the country who felt that way, but this simply isn't very likely. "Sampling error" is really the least of what can go wrong with an honest poll.

Vague wording

A vague question can give only vague information.

This poll asked people if they believed abortion should be legal in "all circumstances", "most circumstances", "only a few circumstances", or "illegal in all circumstances". "Always legal" and "always illegal" seem clear enough, but what about "few" and "most"? Exactly what are the "few circumstances" or "most circumstances" that these people believe justify abortion? The pollsters didn't ask. Perhaps these people meant that they would allow exceptions for just a small number of truly hard cases -- rape, for example. Or perhaps they would exclude only the most frivolous reasons -- like a woman who suddenly decides to have an abortion because someone made fun of her shape.

This problem could easily have been avoided by asking about specific circumstances. It would have been easy to list some fairly obvious ones: life of the mother, health of the mother, rape, incest, deformed child, perhaps a few more. Other polls, such as Wirthlin's, have been more specific about exactly what circumstances, and have given much more meaningful results.

Inadequate choices

Closely related to vagueness is when a question does not give the person a fair range of choices. This is something of a problem in the "circumstances" question in this poll, which, as I say, could have been much improved if it had given more choices and been more specific.

In some polls this is a much more serious problem. A few years ago I saw a number of news stories reporting that a recent poll had found that a majority of Americans opposed a tax cut. I found this hard to believe, and was left wondering whether I was blithely assuming that, just because I favored a tax cut, surely almost everyone else did too. Then I found one news story that gave the actual wording of the question that was asked. It was, "Which do you think is more important: reducing the deficit, or cutting taxes?" There was no option to say "do both and cut spending to make up the difference" -- or to do neither, for that matter. It was a strict either/or. A majority said that reducing the deficit was more important, and thus the media reported that Americans did not want their taxes reduced.

On the lighter side, I once came across a gag survey in a British magazine about Scottish independence. One of the questions was, "In the past -- and probably still in the present -- there was a sinister alliance between France and Scotland to destroy England. Do you believe that, a) the French are more shiftless and untrustworthy than the Scots, b) the Scots are more shiftless and untrustworthy than the French, or c) there is nothing to choose between the two?"

Over-specific wording

At the other extreme, an excessively specific question can make it unclear whether the person is responding to the general principle or the details.

Consider the question about reversing Roe v Wade. It did not ask simply if Roe v Wade should be overturned, but went on to say that this would be done by a Constitutional amendment that would "make abortion illegal in all states". It's difficult to say exactly what part of such a question a person latches onto as important. Someone who is strongly anti-abortion, who considers this the most important issue of the day, might well ignore any qualms they have about details and simply say "yes". But there are many people who oppose abortion in general but would make exceptions in hard cases, or who simply don't consider it a top priority issue justifying drastic measures. They might feel this question represents an extreme view, and be reluctant to say they support such a measure. Yet the same people might support reversing Roe if it was done without a Constitutional amendment, or if it left it up to Congress or the states to decide what reasonable restrictions on abortion might be. From this question, it's hard to say.

This problem could easily have been avoided by simply asking, "Do you think that the Supreme Court's Roe v Wade decision should be reversed or overturned?". Of course that still leaves other problems, like ...

Assumed knowledge

A question may include hidden assumptions about the person's knowledge. Suppose you ask, "Should the U.S. send troops to Country X?" This implicitly assumes that the person knows what is going on in Country X and why the U.S. might want to send troops there.

In this poll, the questions about making abortion laws more or less strict and overturning Roe assume that the person answering knows what present law is and knows what Roe v Wade really says. As I've noted, I suspect this was the major source of apparent contradictions in this poll.


Assumed meanings

How people respond to a question may be very dependent on exactly what they understand certain words to mean in context.

This poll asked a very straightforward question about whether people considered themselves "pro-life" or "pro-choice", and got results inconsistent with questions asking their views more specifically. I suspect the problem here is that people do not define these terms the same way that activists on either side as well as the media define them. In this case the problem was at least apparent, because they asked specifically about these labels and asked other related questions that can be used for comparison.

But in other cases, the inaccuracy caused by hidden assumptions can be more subtle. Gallup recently conducted a poll on school vouchers where they asked two slightly differently worded questions. Half the people were asked if government funds should "pay for tuition at a private school". 48% said yes, 47% no. The other half were asked if government funds should "pay for tuition at the public, private or religious school of their [the parents'] choice". 62% said yes, 36% no. At first I found this result surprising. I thought that there would be at least some number of people who would agree to vouchers paying for education at a secular private school, but not at a religious one. But in fact specifying that religious schools would also be an option increased the positive response. Could it be that the difference was due to the image that the words brought up in people's minds? Like, when you say "private school", they think of some exclusive preppy school for rich kids, and say no, their families can pay for that themselves. But when you say "private or religious", they think of the Catholic school down the street that lots of their friends' kids go to, or the mission school downtown that caters to poor minority children, and they think, sure, these people all need some help with tuition.

This was not an isolated case. Pollsters have often found that seemingly trivial changes in wording can dramatically alter the results of a poll. Ask people if the "government" should pay for some program and you get a much more positive response than if you ask if it should be paid for "with your tax dollars". For people who have firm opinions about an issue, such differences in wording probably don't change much. But most people don't have firm opinions on any given issue, so slight changes in wording can make them perceive what you are asking as being a bit more extreme than they can support, sometimes even tilt them one way or the other.

A basically fair poll

Note that all these problems come up in a poll which does not give any obvious signs of bias. I don't think Gallup was trying to slant the results of this poll; by all appearances they did their best to make it fair.

For example, notice that they asked people whether they described themselves as "pro-life" or "pro-choice". They offered each side the term that those people themselves prefer. They did not ask people if they were "pro-choice" vs "anti-choice", or "pro-life" vs "pro-abortion". While I find the results of this question inconsistent with other polls and with my understanding of the meanings of the terms, nevertheless I cannot call it biased. I don't think the answers mean what they sound like they mean on the surface, but the question was fairly asked.


Conclusions

Does this mean that polls are worthless? No. Even though I have many criticisms of this poll, I nevertheless find it useful and informative in many ways.

The point is that a poll must be interpreted carefully. A reporter's summary of what a poll says is almost worthless. If you don't know exactly what was asked and what the answers were, it is practically impossible to tell what the poll really meant. Especially on highly controversial issues like abortion, a reporter's own opinions are likely to shade how he reports on a poll. In this case, Gallup themselves headlined the press release, "Majority of Americans Say Roe v Wade Decision Should Stand". A pro-life reporter could, with equal justification, have headlined a story on this poll, "Majority of Americans say there should be more restrictions on abortion". (In fairness to Gallup, their press release did highlight the fact that there has been a steady increase in the percentage of Americans who call themselves "pro-life".)

Polls must be analyzed carefully to see what they really say. As a pro-life activist looking at this poll, I see a number of interesting possible conclusions. For example, I see that many Americans who support significant restrictions on abortion nevertheless call themselves "pro-choice". I must ask, Why? Are they confused about what pro-lifers and pro-choicers really say? Does the pro-life movement have a bad reputation, such that even people who basically agree with us don't want to be associated with us? Etc. Or: I've theorized that some of the inconsistencies in this poll might be because people don't know what present U.S. abortion law really is. Perhaps it would be valuable for pro-lifers to try to make people better informed on the legal questions -- we might turn people who think they are satisfied with the present situation into allies. I'm sure there are similar lessons that might be learned by a pro-choice activist. From their point of view, the strong support for banning partial-birth abortion might indicate that defending this practice to the last man might be a mistake on their part; perhaps if they agreed to ban this procedure they could paint themselves as more moderate, and shore up their support. And so on.

But whenever you hear someone pull one question out of a poll and, without even telling you exactly how the question was worded, loudly proclaim, "See, the vast majority agree with me" ... well, I'd take that with a grain of salt.


1. If you want to impress your friends with your statistical prowess, here's a good approximation to the formula for computing the margin of error based on sample size. That is, assuming that you pick people completely at random, this formula gives a measure of how much your poll might be off purely because the people you questioned turned out not to be typical. For a yes/no poll, there is about a 95% chance that the error will be less than 100% divided by the square root of the number of people questioned. For example, if you questioned 16 people, the square root of 16 is 4, and 100% divided by 4 is 25%, so there is a 95% chance that your results will not be off by more than 25%. Well, that wouldn't be very encouraging. Most professional polls today question about 1000 people. The square root of 1000 is about 32, and 100% divided by 32 is about 3%, so most polls today have a "margin of error" of 3%.
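The footnote's rule of thumb translates directly into a one-line calculation (this is the standard 100/sqrt(n) approximation described above, not a formula from Gallup's release):

```python
def margin_of_error(sample_size):
    """Approximate 95% margin of error, in percentage points,
    for a yes/no poll: 100 divided by the square root of the
    number of people questioned."""
    return 100 / sample_size ** 0.5

print(margin_of_error(16))    # → 25.0
print(margin_of_error(1000))  # about 3.16
```

Note how slowly the margin shrinks: to cut the error in half you must question four times as many people, which is why pollsters settle on samples of about a thousand rather than tens of thousands.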


Posted 31 Jan 2001.

Copyright 2001 by Pregnant Pause
Contact us