Is the expansion of Medicaid taking a turn for the worse?
Late last month, Washington, D.C., public radio station WAMU reported that a poll of 800 Virginia residents suggested just that.
The poll, conducted by Christopher Newport University’s Judy Ford Wason Center for Public Policy, found a majority opposed to expansion, 53% to 41%. Two months earlier, in February, the same poll had reported majority support, 56% to 38%. That represents an astounding 30-point swing in public opinion: from an 18-point margin in favor to a 12-point margin opposed.
With Virginia Governor Terry McAuliffe trying to get the Republican legislature to accept Medicaid expansion as provided in the Affordable Care Act, these results seemed to show a GOP advantage. As WAMU’s Michael Pope wrote: “The shift in the public’s views suggests that Republican lawmakers are winning the long-running public relations battle over McAuliffe and Democratic lawmakers.”
Before looking into the details of the poll, we should recognize that opinion changes of that magnitude tend to take years to happen, absent some immediate crisis or national event dominating the news media. That one supposedly occurred in a two-month period should immediately raise our skeptical antennae.
And indeed, there is reason to doubt the alleged shift in opinion. As Geoff Garin, McAuliffe’s pollster during the gubernatorial campaign, explained to Slate’s David Weigel, the “result is totally driven by the way the question is framed and the information in the question, rather than a reflection of what voters might be hearing about the issue on their own.”
What the Polls Asked and the Information They Provided
As Garin implies, the respondents in the two polls were provided what can only be described as tendentious information and biased question wording. Here are two of the February questions with results:
“Medicaid is a health care program for families and individuals with low income that is funded by both federal and state tax dollars. Currently, Virginia is faced with a decision about whether to expand the Medicaid program to cover an additional 400,000 mostly working poor Virginians who are uninsured. In general do you support Medicaid expansion or do you oppose it?”
56% Support
38 Oppose
6 No opinion
“Some people worry that the federal government will not pay its share if Virginia expands Medicaid. Would you support Medicaid expansion even if the federal government did not pay its share and Virginia had to cover that cost, or would you oppose it?”
41% Support
54 Oppose
5 No opinion
Note that the second question presents a GOP criticism for which there is no evidence, and which contradicts the law that was passed by Congress and signed by the president. It indicates that if the money were not available, and Virginia had to pay for the Medicaid expansion all by itself, the public would oppose expansion by a 13-point margin.
Now here are the two April questions with results:
“How closely have you been following the debate in Richmond over the issue of expanding subsidized health insurance coverage for the working poor, would you say…”
20% Very closely
38 Somewhat closely
23 Not closely
20 Not at all
“In that debate, the Democrats propose to subsidize private insurance for 400,000 uninsured and low income Virginians by using federal Medicaid money that would otherwise not come to Virginia. Republicans oppose this expansion because they fear the federal Medicaid money will not come as promised, and also say the current Medicaid program has too much waste and abuse and needs reformed before it is expanded.
“I’d like to know where you stand, would you say that you generally [RANDOMIZE: ‘support using federal Medicaid money to expand health coverage’ or ‘oppose using federal Medicaid money to expand health coverage’]?”
41% Support using federal money to expand health coverage
53 Oppose using federal money to expand health coverage
6 Unsure/no opinion
Note what the second poll does. It includes the baseless charge that federal Medicaid money may not be available “as promised,” as though the Affordable Care Act were not the law of the land, but rather some personal “promise” that could easily be broken.
After respondents are told that the money may not be available, they show a net 12-point opposition, reinforcing the first poll’s finding that Virginia voters don’t want to fund the expansion without the promised federal funds.
So, do the results of the two polls show a “change” in public opinion over time as claimed? Not at all. They show a consistent opposition to Medicaid expansion without federal funds.
Unlike the first poll, the second poll doesn’t measure voters’ opinions about expansion under the assumption that the money is available. But had the pollster asked that question in April, the results would almost certainly have been similar to those from February.
Clearly, then, there is no evidence that public opinion has changed, despite the pollster’s claim.
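Since the whole controversy turns on the size of the reported “swing,” it is worth checking the arithmetic directly. Here is a minimal sketch in Python, using the support/oppose percentages from the two poll tables quoted above (February: 56% support, 38% oppose; April: 41% support, 53% oppose); the function names are illustrative, not part of any polling standard:

```python
def margin(support: int, oppose: int) -> int:
    """Signed margin in points: positive means net support, negative net opposition."""
    return support - oppose

def swing(before: int, after: int) -> int:
    """Total point swing between two signed margins."""
    return abs(after - before)

feb = margin(56, 38)  # February question: net support
apr = margin(41, 53)  # April question: net opposition
print(f"February margin: {feb:+d}, April margin: {apr:+d}, swing: {swing(feb, apr)}")
```

The signed-margin convention makes the direction of each result explicit, and the swing is simply the distance between the two margins. Of course, as argued here, treating that distance as a real movement in opinion assumes the two questions measured the same thing, which they did not.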
What Does “Public Opinion” Mean On An Issue When Few People Know About It?
Probably the first thing to notice about the policy questions in both polls is that they are based on the assumption that most people don’t know much, if anything, about Medicaid or the expansion of Medicaid. That’s why both questions first have to inform the respondents what Medicaid is before asking whether they support or oppose it.
That assumption of public ignorance seems largely supported by the introductory question in the April survey. Note that 43% of Virginians say they have followed the issue of Medicaid expansion “not closely” (23%) or “not at all” (20%), while an additional 38% say they have followed it only “somewhat” closely.
Think about that for a moment: About four in five people answering the poll know only a little or nothing about Medicaid expansion. Yet all of them are asked for their opinion of it. And sure enough, in each poll, 94% of the sample expresses an opinion either for or against expansion of Medicaid.
The Importance of Question Wording
If so many people know so little, on what basis are they formulating an opinion? The answer: On the way the question is worded. Most people know only what the pollsters have told them about Medicaid.
In the first poll, respondents were initially told that Medicaid helps the poor, and by a large margin the respondents said that sounded good and should be expanded.
Once they were told, however, that the money might not be available, a majority opposed expansion.
In the second poll, respondents were also told that Medicaid helped the poor, but that Republicans felt it was possible that the Medicaid money would not come as promised, and that the current program may have too much abuse and may need to be reformed before it is expanded.
Well, who could support a program for which there might not be any money and which needed reform anyway? Based on what we saw from the second question in the first poll, the response was predictable: Opinion is negative without federal funds.
No doubt some respondents would have supported Medicaid expansion in both polls, and some would have opposed it in both. But a large segment – the segment that knows virtually nothing about the issue (estimated by the second poll to be at least four in ten citizens, and possibly more) – is highly influenced by what the pollsters tell them.
Christopher Newport University’s Quentin Kidd Responds to Criticism
In response to criticisms he says he’s received (e.g., Slate and HuffPollster), the Director of the Wason Center who oversaw the poll, Dr. Quentin Kidd, defended the poll’s findings and conclusions in a subsequent op-ed piece on PilotOnline. He focuses on the change in question wording between the first and second polls, arguing that because the partisan arguments over the issue had changed during that period, he had to change the question wording.
But that argument misses the point altogether. What Virginia voters tell pollsters is very much a reflection of what the pollsters tell Virginia voters. Yes, some Virginia voters have unshakable views of their own, but most do not.
The difference in poll results between February and April does not reflect a change in the political arguments being made in Richmond, but rather the change in the information the pollster gave his respondents.
Message Testing Is Not the Same as Measuring Public Opinion
It’s important to reaffirm that basic point: The two polls do not measure public opinion as it existed at the time of the interviews.
Instead, the polls measure a hypothetical public opinion that might possibly have existed if everyone in the Commonwealth of Virginia had been fed exactly the same partisan arguments that the poll fed its respondents. This is a classic case of what partisan pollsters do: They test messages to see which ones seem to work, and which ones do not.
The message testing in this Christopher Newport University poll is clearly designed to reinforce the Republican position, with its emphasis on the program’s shortcomings and the claim that federal funds might not be available. One could just as easily have designed a message-testing poll that reinforced the Democratic Party’s position – focusing, for example, on the lives saved by expansion, the millions of people the program has helped, and the fact that the money is guaranteed by law.
That’s the beauty of message testing polls – they provide partisan information that suggests how a political party can best frame its side of an issue.
But message testing is not measuring what public opinion actually is today. And measuring today’s public opinion is, presumably, what media and academic polls are intended to do.
How to Measure Public Opinion
Is there an objective way to ask about the issue? Pure objectivity is difficult, but a good start would be to avoid providing respondents any information – either for or against the program.
Why not give the respondents any information? Once a sample of respondents has been “informed” about an issue by a pollster, that sample can no longer be used to represent the general population – because the general population has not been “informed” about the issue in the same way. The sample is now different from the population.
Instead of giving respondents information, the pollster could just ask people if they support or oppose Medicaid expansion, or if they don’t know enough to say.
Then the pollster could follow up by asking how intensely respondents feel about the issue – whether, for example, they would be upset if the opposite of what they said they preferred came to pass.
That approach, as I’ve argued elsewhere, would give us a more realistic description of what the public is thinking – how many people even have an opinion, and of those who do have one, what its direction is and how strongly people feel about it.
The final results would be a realistic view of public opinion as it actually exists – based on what people have heard on their own, and what they think as a result, rather than on what pollsters tell their respondents.
The lesson for poll watchers: When pollsters “inform” their respondents about an issue, beware! The results will say more about the bias in the question wording than about what the public is really thinking.