On PBS NewsHour last Thursday, Judy Woodruff interviewed Dr. Thomas Frieden, director of the Centers for Disease Control and Prevention, about the government’s efforts to rid the country’s food system of artificial trans fats.
From the tone of the questions, it was clear the program wanted to stoke the fires in what USA Today refers to as the “nanny state debate.”
Why, Woodruff wanted to know, was the government taking any steps to rid the food system of trans fats, since the industry itself was already moving in that direction?
Would the government’s efforts mean that products like microwave popcorn, which she admitted she likes a lot, “are just going to go away?”
And how does Frieden answer critics who say “the government shouldn’t be meddling in what people eat?”
In support of the last question, she cited a new Pew Research poll that shows a majority of people oppose “prohibiting restaurants from using trans fats in foods.”
Ah…the polls, ultimate arbiters of what the public thinks. Except when the polls themselves are sloppy.
Did the people in the Pew sample even know what the question meant? How many people do you think know the difference between fats and trans fats? It’s understandable that a majority of people wouldn’t want to eliminate “fats” from restaurants, but trans fats? As the USA Today article notes,
“Hardly anyone defends the icky-sounding artificial ingredient anymore, two decades after health activists began warning Americans that it was clogging their arteries and causing heart attacks….”
“Mostly, Americans’ palates have moved on, and so have their arguments over what’s sensible health policy and what amounts to a ‘nanny state’ run amok.”
So, how was a poll able to show so many people defending trans fats?
Pew did not ask whether their sample respondents knew what trans fats were, nor did Pew provide an explicit “don’t know” option. Instead it used a forced-choice format:
“As I read some policies that have been considered by some cities and states around the country, please tell me whether you would favor or oppose each:”
“Prohibiting restaurants from using trans fats in foods”
“Don’t know (Volunteered)”
Research shows that when respondents are not offered an explicit “don’t know” option, most feel obligated to come up with an answer. And in this case, many respondents may well have assumed the question was about fats in general, not knowing the difference between fats and trans fats.
Before asking this question, Pew should have tried to measure how many people even know there is a difference between the two ingredients. And they should have tried to measure how many people are aware of the harmful effects of trans fats.
Polls can provide important insights into what the public is thinking, but not if pollsters take shortcuts. Pew’s approach on this issue was a superficial, and ultimately misleading, effort to report on public opinion – designed not to understand the public, but to provide any ol’ measure that could be used in a news story, regardless of its validity.
I like the response that Frieden gave to Woodruff, when she presented the results of the Pew poll:
“You know, I think it has to do really with often how you ask questions. If you ask people, do you want to have a substance in your food that may kill you that’s artificial and you didn’t know was there, I think you would get a very different response.”
UPDATE: 11/13/2013 6:45 PM EST: Pew Research Center communications manager Russ Oates sent iMediaEthics the response below to David Moore’s blog post, above. Moore plans to respond this week to the issues raised. The letter below is from Pew Research Center for the People & the Press Director Michael Dimock:
In his Nov. 12 column, David Moore raises a legitimate point – one major challenge to measuring public opinion is that people may not always know what you are asking about. This applies beyond specific public policy questions such as the one on banning trans fats that the Pew Research Center published last week.
Moore suggests that pollsters always include a series of knowledge questions to first determine the public’s readiness to evaluate the public policy at hand. This approach has some appeal, yet it is both impractical and can lead to serious biases. Practically speaking, any given survey asks respondents about multiple people, issues and policies that they may or may not be familiar with – preceding each with a knowledge test would put a substantial burden on the interview and the person being surveyed. It could also lead to a skew in the balance of opinion being reported. It is not uncommon for an issue to garner far more attention from one side of the debate than the other. For example, if we presented views of Ted Cruz only among people who could prove their knowledge about him, it could produce much more one-sided assessments.
Moore’s second suggestion, that every polling question explicitly offer the option of saying “I have no opinion,” is more practical from a survey administration perspective, but also poses challenges to the interpretation of survey results. Respondents were able to volunteer “no opinion” to our specific trans fat question, but our long experience indicates that offering “don’t know” as an option may prompt some people to opt out of expressing opinions they actually have. And this effect may not be evenly distributed – there are large differences in the propensity to say “don’t know” by age, gender and other factors, which can introduce different biases to the survey results.
Public opinion polling has long struggled with such tradeoffs, and there is no single, right approach. A question about trans fats that included an explicit “no opinion” option would almost certainly have found many more Americans choosing that option. But this might produce a less accurate snapshot of opinion on this public health issue for the reasons discussed above.
The Pew Research Center takes seriously its mission to accurately measure public opinion. We constantly assess our methods and consider the issues I have outlined here as well as many others whenever we go into the field with a new survey.
Pew Research Center for the People & the Press