Earlier this month, Public Policy Polling issued a poll news release, which claimed that more than half of President Trump’s supporters agree with this statement: “The Bowling Green Massacre shows why we need Donald Trump’s executive order on immigration.”
There was no Bowling Green Massacre, of course, though the suggestion that such an event occurred, and that it helped to justify the president’s immigration ban, came from Trump’s own counselor, Kellyanne Conway. Later, she acknowledged her “error.”
Nevertheless, the PPP poll finding caused distress for Natalie Jackson, who oversees HuffPollster and writes a column for Huffington Post. In her critique, Jackson argued that “the poll question was a setup, and breathlessly promoting its results will only make distrust in the media worse.”
Jackson had previously excoriated PPP because, she wrote, it “traffics in attention-grabbing but dubious questions.” Included in that condemnation was a PPP poll that found 41% of Trump supporters in favor of bombing Agrabah, the fictional country in Aladdin.
These PPP “attention-grabbing” polls may include some dubious questions, but the results highlight the ignorance of the American public, and thus the tenuous nature of most public opinion polls, something that most pollsters – including HuffPollster – don’t like to acknowledge.
Surprisingly, Jackson does admit in principle that most people may not be engaged in public policy issues, and offers some excellent advice. In her most recent column, she writes:
“First, most of us who read polls live in a world focused on media and politics, and we should get out of that mindset.
“The average American doesn’t spend hours of their day glued to Twitter and cable networks watching all the latest developments in Washington, D.C. Most are busy living their lives, caring for families, working non-political jobs or all of the above…
“Not knowing about the issue doesn’t make people stupid, either. The pace of news in the last few weeks has been extremely fast. People with nonpolitical lives can’t be expected to keep up.”
So, what happens when respondents with nonpolitical lives who can’t be expected to keep up are asked to give an opinion on an issue about which they have little information? After all, given what Jackson has (accurately, in my view) described above, that situation is likely to be a frequent occurrence.
Jackson points to research suggesting that in such situations, a few respondents will admit to having no opinion on the issue, but many will feel pressured – in the context of an opinion survey – to come up with an opinion anyway. So, they say the first thing that comes to mind.
“This is why polling on specific issues is difficult – people often haven’t given issues a lot of thought, but when prompted, they will make up an opinion.”
She expressed that same view another way, in her commentary about PPP’s Agrabah poll:
“When a poll question puts someone on the spot, there’s a chance they’ll end up expressing an opinion whether they have really thought it out or not, and whether it’s about Syria or Agrabah.”
So far, Jackson and I completely agree.
- We agree: Few people pay close attention to most public policy issues, because they have lives to lead.
- We agree: When pollsters ask questions that put people on the spot, as happens in every poll, many people – especially people with nonpolitical lives who can’t keep up – will simply make up an “opinion” on the spot.
- We agree: The problem is that those “opinions” are then treated as serious, as in the PPP Bowling Green Massacre poll, where the pollster treated the results as though they accurately reflected the views of the public – and thus made the American public look dumb.
- We agree: Such an interpretation is not a realistic view of public opinion. The respondents were “set up” – given information and pressured to take an immediate position, in favor or opposed, with no chance to reflect on what the question meant.
And yet, that last description applies to HuffPollster’s own polls as well. They don’t give false information, as PPP did, but they do give information without asking respondents if they have ever heard of the matter, and then demand an immediate answer. Many people, of course, can give reflective views, but – as Jackson has noted – many people simply haven’t paid close enough attention to the issues to give anything more than top-of-mind responses.
Jackson was critical of PPP for not finding out how many people had even heard of the Bowling Green Massacre, and whether they knew it was fake or not.
“In that frame of mind, how many Americans have probably heard of the Bowling Green Massacre, or know it’s a fake thing Kellyanne Conway made up last week? We don’t know. The poll didn’t ask that question.”
But in HuffPollster’s own polls, they make the same mistake: HuffPollster doesn’t ask respondents if they’ve ever heard of the issue being analyzed.
In HuffPollster’s most recent poll on the immigration ban, here is the very first question:
“President Trump recently signed an executive order banning travel for people from seven Muslim-majority countries – Iran, Iraq, Syria, Sudan, Libya, Yemen and Somalia – for 90 days, and suspending the admission of refugees for 120 days. Do you approve or disapprove of this ban?”
Note that there is no question asking people whether they have even heard of the ban.
A follow-up question used the same format:
“As you may know, President Trump’s travel ban has been temporarily halted by a judge, and a federal appeals court recently refused to reinstate the ban. Do you think the court made the right decision or the court made the wrong decision?”
Again, no question to see if people even knew the travel ban had been halted and a federal appeals court had refused to reinstate the ban – yet, immediately upon providing the information, the question demanded an “opinion.”
That format produced 90% of Americans with a reportable opinion on the first question, and 81% on the second.
If, as Jackson argues, the average American isn’t paying attention to politics, then how could HuffPollster find more than eight in ten Americans with meaningful opinions on the ban and the court reaction to the ban?
Answer: by pressuring respondents to make up opinions on the spot – opinions that, Jackson argues, shouldn’t be taken seriously.
The major difference between the PPP poll question and the HuffPollster poll question is that the latter did not provide false information. But, like the PPP poll, the HuffPollster poll put many of its respondents on the spot, pressuring them to come up with opinions “whether they have really thought it out or not.” How many? We don’t know. As Jackson said of the PPP poll, “the poll didn’t ask that question.”
The irony of Jackson’s criticism is that the very complaints she levels against PPP – providing information (whether accurate or false) to respondents and then immediately demanding a superficial assessment – can be leveled against her own polls as well.
She can surely claim that her own polls never provide false information. But that is a minor difference. In both cases, many unengaged respondents are pressured to provide an “opinion” without knowing anything about the issue.
The main consequence of providing false information is that it reveals a largely ignorant public, which would otherwise go unnoticed. And it’s that revelation which really troubles Jackson and the public policy polling industry.