Those 8 Polls on the Muslim Ban: What the Public Thinks


On Tuesday, HuffPollster listed the eight major media polls that have attempted to measure the public’s reaction to President Donald Trump’s ban on entry by people from seven Muslim-majority countries, frequently called a Muslim ban. The Executive Order explicitly indicates the ban is temporary, and all but one of the eight polls noted that fact in the questionnaire. All but one poll also mentioned Trump’s name. Neither factor appears to have much effect on the results.

The results range from a 9-point margin of approval of the ban to a 13-point margin of disapproval, a net swing of 22 points.

HuffPollster has posted several columns (here, here, and here) suggesting reasons for the differences among the polls on this issue, with only one suggestion marginally supported by plausible evidence – that live-caller phone polls are more likely to measure disapproval than polls conducted online or by automated phone calls (known as Interactive Voice Response, or IVR).

Those three live-caller phone polls – conducted by CNN/ORC, CBS, and Gallup – all show a majority opposed to the ban, by 6, 6, and 13 points, respectively.

However, SurveyMonkey, an online poll, also shows a majority opposed, though by just 4 points, and PPP, a combined online and IVR poll, reports a 2-point margin against the ban.

Three of the eight polls – those by Rasmussen, Reuters/Ipsos, and HuffPost/YouGov, all conducted online or by a combination of online and IVR methods – find more people in favor of the ban than opposed, by 9, 7, and 4 points, respectively.

Thus, the trend does suggest that the live-caller phone surveys may tend to tap a more negative response than the other polls. However, with just eight examples, we can’t have much confidence that any patterns we see are much different from what might occur by chance.
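A quick back-of-the-envelope check, sketched below in Python, illustrates both points using only the margins quoted above: the 22-point swing, and how easily the live-caller pattern could arise by chance. The grouping of the polls is taken from this article; the simple combinatorial calculation is an illustration, not a formal significance test.

```python
from math import comb

# Net margins (favor minus oppose, in points) as reported above;
# positive values mean net approval of the ban.
margins = {
    "Rasmussen": 9, "Reuters/Ipsos": 7, "HuffPost/YouGov": 4,  # net approval
    "PPP": -2, "SurveyMonkey": -4, "CNN/ORC": -6, "CBS": -6, "Gallup": -13,
}

# Swing from the most favorable result to the least favorable one.
swing = max(margins.values()) - min(margins.values())
print(swing)  # 9 - (-13) = 22 points

# Five of the eight polls show net disapproval. If the three live-caller
# polls were simply a random draw of three polls from the eight, the chance
# that all three would land among the five net-disapproving polls is:
p = comb(5, 3) / comb(8, 3)
print(round(p, 2))  # about 0.18 -- not a rare event with only eight polls
```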

 

Measuring Non-Opinion?

The one common attribute of the polls is that they all asked a “forced-choice” question – one which asked respondents if they favored or opposed the ban, but did not explicitly offer a “don’t know” or “no opinion” alternative.

This is an important point, because it’s quite likely – based on the whole history of polling* – that large numbers of people simply haven’t formulated an opinion on the issue, and that “forced-choice” questions distort the measure of public opinion.

In one of its postings, HuffPollster tries to make the case that online and IVR polls do in fact provide a measure of non-opinion:

“And there’s another difference that could help to explain the noticeable gap between modes. While some pollsters, especially those using online or automated surveys, explicitly give respondents the option to say they’re not sure about a question, others, especially those using live interviewers, will accept that response only when it’s volunteered.”

This is a highly misleading statement. It is true that online and IVR polls usually have a “no opinion” option. That’s necessary, because if a respondent genuinely doesn’t want to answer, and there is no “no opinion” option, the respondent will just stop the survey (online) or hang up (IVR).

With a live caller, if respondents don’t want to respond, they can tell that to the interviewer and the survey will continue.

That’s why IVR and online polls offer a non-opinion option. But that doesn’t mean respondents are much more likely to take it.

So it’s important to remember that the questions posed to the respondents in the online and IVR polls were still in a “forced-choice” format, which pressures the respondents to come up with an opinion, regardless of whether they have thought much about the issue.

In an IVR poll, the recorded voice follows the question by saying, “Press 1 for agree, press 2 for disagree.” Depending on the pollster, a pause of a couple of seconds may pass before the voice adds, “Press 3 if you are unsure.”

If the third option is delayed long enough, most respondents will not hear it, because they will have already pressed 1 or 2. The longer the pause, the less likely the third option will be chosen. That’s why IVR polls can obtain such high numbers of respondents expressing an opinion, even though a “no opinion” alternative is eventually offered. (See below.)
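To make the timing mechanism concrete, here is a minimal simulation sketch. The response times, the share of undecided respondents, and the delays are invented for illustration only; the sketch simply shows that the later the “unsure” prompt arrives, the smaller the share of undecided respondents who are still listening when it plays.

```python
import random

def ivr_unsure_rate(delay_seconds, n_respondents=10_000, undecided_share=0.4):
    """Toy model: each undecided respondent gives in and presses 1 or 2 at a
    random moment unless the 'unsure' prompt plays first. All parameters are
    illustrative assumptions, not measurements from any real poll."""
    unsure = 0
    for _ in range(n_respondents):
        if random.random() < undecided_share:
            give_in_time = random.uniform(0, 6)  # seconds before forcing a choice
            if delay_seconds < give_in_time:     # prompt heard before giving in
                unsure += 1
    return unsure / n_respondents

for delay in (0, 2, 4):
    print(f"'unsure' offered after {delay}s -> {ivr_unsure_rate(delay):.1%} choose it")
# The output trends downward: the longer the pause, the fewer 'unsure' responses.
```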

In the case of the online poll, a similar process takes place – except that the question and responses are offered in a visual format. Following the initial “forced-choice” format question, the screen will display the two major responses: agree (favor) or disagree (oppose). Somewhere below these two choices will be the “no opinion” option. Its placement could affect the likelihood of its being selected.

The main point here is that online and IVR polls do not really measure non-opinion when they ask forced-choice questions, because the question itself seems to demand an answer. Evidence for that conclusion can be found in a comparison of the percentages of people who offered an opinion on the immigration ban question among the eight polls.

(Table: percentage of respondents expressing an opinion on the immigration ban, by poll)


Yes, that number for CNN/ORC is correct. On the immigration ban question, CNN found virtually all Americans with an opinion worth reporting. On some other questions related to the issue, they found “only” 98% to 99% with meaningful opinions.

As implausible as the CNN figure might seem, the percentages for the other seven pollsters are beyond belief as well. It doesn’t matter what mode is used – the notion that nine in ten (or more) Americans are following politics closely enough to have formulated an opinion on this issue is simply not supported by past experience.

As demonstrated in previous posts on this site – on issues as widely divergent as the use of bathrooms by transgender people, whether Merrick Garland should have been confirmed for the Supreme Court, views on the nuclear deal involving Iran, and public support for the “bailout” of the auto industry – questions that offer a “no opinion” alternative generally reveal anywhere from a third to half of the public selecting that option.

That so many people are unengaged in the day-to-day policy arguments may be shocking to people who think American democracy depends on widespread public engagement. But it’s a fact that needs to be recognized – despite what the media polls would have us believe.

 

__________________________________________

*Polls that specifically try to measure non-opinion and weak opinions will show large numbers of the public unengaged on any given issue. Among the many studies that explicitly address this issue, I recommend Daniel Yankelovich, Coming to Public Judgment: Making Democracy Work in a Complex World (Syracuse University Press, 1991); George Bishop, The Illusion of Public Opinion: Fact and Artifact in American Public Opinion Polls (Rowman & Littlefield, 2004); and my own The Opinion Makers: An Insider Exposes the Truth Behind the Polls (Beacon Press, 2009).
