[lbo-talk] polling docs on "reform"

Doug Henwood dhenwood at panix.com
Mon Sep 21 09:22:43 PDT 2009


<http://www.nationaljournal.com/njonline/mp_20090918_4062.php>

MYSTERY POLLSTER
A Tale Of Two Doctor Polls
Pollsters Who Don't Disclose Their Methods Render Their Results Suspect

by Mark Blumenthal Monday, Sept. 21, 2009

Last week saw the release of two new surveys of doctors with very different results. The first, funded by the Robert Wood Johnson Foundation and published in the New England Journal of Medicine, found a large majority (73 percent) favoring some form of a "public option" health care plan "like Medicare," either alone or in combination with private plans. Another, conducted by Investor's Business Daily and the TechnoMetrica Institute of Policy & Politics (TIPP), found almost as many (65 percent) expressing opposition to a "proposed government expansion [health care] plan."

That's a big difference. Were they talking to different kinds of doctors?

Perhaps, but unfortunately, only one of the surveys tells us enough about its methods to help us know for certain. That lack of transparency should weigh heavily in deciding which survey to trust.

The report published in the New England Journal of Medicine included extensive information about its methods. Alex Federman, one of the two medical researchers at the Mount Sinai School of Medicine in New York who conducted the survey, also responded to my query for more information.

In contrast, the IBD article fails to disclose some of the basic elements of its survey's methods and, as of publication time, neither IBD nor TIPP has responded to my requests for more information.

How would more details help? Consider some specifics:

• Mode. Both surveys were conducted by mail. Randomly selected doctors received a paper questionnaire that they were asked to fill out and return. In a critique of the IBD survey, blogger Nate Silver attacked that practice as "unusual." He's wrong. While mail is an odd choice for pre-election polls, it is an ideal way to survey doctors.

Why? Doctors are notoriously difficult to reach by telephone in their offices. Moreover, the best list of doctors (more on that below) has mailing addresses for all, but telephone numbers for only some. "Our best practice is to start with mail," said Craig Hill, a vice president at RTI International who has considerable experience managing surveys of physicians. "Time and time again, that seems to be the preferred mode."

• Population and Sample. One of the main reasons survey researchers rely on mail surveys of doctors is the existence of a high-quality list: the American Medical Association Physician Masterfile is a list of all U.S. physicians -- AMA members and non-members alike -- that is comprehensive because it adds doctors when they enter medical school, when they obtain a license to practice and when they receive additional, mandatory professional certification. In other words, it is a very accurate list.

The Mount Sinai researchers report that they drew a random sample of all physicians in the 50 states but excluded those "in training." The IBD article tells us only that they conducted a survey of "practicing physicians chosen randomly throughout the country." They neither define "practicing" nor explain how they sampled the physicians they interviewed.
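
To make concrete what drawing a random sample from a frame like the Masterfile involves, here is a minimal Python sketch. The record fields, the frame contents and the exclusion rule are illustrative assumptions, not the researchers' actual code or data.

    import random

    def draw_physician_sample(frame, n, seed=2009):
        """Simple random sample of n physicians, excluding those still in training."""
        eligible = [doc for doc in frame if not doc.get("in_training", False)]
        return random.Random(seed).sample(eligible, n)  # fixed seed makes the draw reproducible

    # Hypothetical frame records; the real AMA Masterfile is far richer.
    frame = [
        {"id": 1, "specialty": "internal medicine", "in_training": False},
        {"id": 2, "specialty": "surgery",           "in_training": True},
        {"id": 3, "specialty": "pediatrics",        "in_training": False},
    ]
    print(draw_physician_sample(frame, 2))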

• Question Text. The Mount Sinai report includes the verbatim text of how they described the three proposals that they asked physicians to evaluate:

1. Public and Private Options: Provide people under 65 the choice of enrolling in a new public health insurance plan (like Medicare) or in private plans.

2. Private Options Only: Provide people with tax credits or low-income subsidies to buy private insurance coverage (without creating a public plan option).

3. Public Option Only: Eliminate private insurance and cover everyone in a single public plan like Medicare.

When asked to choose "which proposal you most strongly support," 63 percent chose the combination of public and private plans, 27 percent chose the private options only and 10 percent chose the public option only. (Adding the first and third groups -- 63 percent plus 10 percent -- gives the 73 percent cited above as favoring some form of a public option, either alone or in combination with private plans.)

The IBD article is unclear about the wording of its favor-or-oppose question. In a graphic, the article says "physicians were asked: Do you support or oppose proposed plan," a bit of ungrammatical text that suggests it was compressed from something longer. The article itself reports that 65 percent "say they oppose the proposed government expansion plan." Was that how they described it? Who knows?

• Response Rate and Bias. The Mount Sinai survey does something very few pre-election surveys do: It discloses its response rate (43.2 percent). While that level may seem low, the authors note that it is "typical of most recent national physician surveys" published in medical journals. It is also much higher than the rates reported by media pollsters for their pre-election surveys last fall.
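
At its simplest, a response rate is completed questionnaires divided by eligible sampled cases; AAPOR also defines several more refined formulas. A toy Python calculation with invented counts, just to show the arithmetic behind a figure like 43.2 percent:

    def response_rate(completed, sampled_eligible):
        """Completed questionnaires divided by eligible physicians sampled."""
        return completed / sampled_eligible

    # Invented counts, chosen only to illustrate the arithmetic.
    print(f"{response_rate(432, 1000):.1%}")  # -> 43.2%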

The Mount Sinai report includes something else that is standard for medical journals but rare for published media polls: It includes a table showing key demographic characteristics of both the respondents and non-respondents. Since the AMA Masterfile includes data on physician specialty, type of practice and geographic location, researchers can check for potential indicators of bias. In this case they found no significant differences between those who responded to the survey and those who did not.
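
Such a check might, for example, compare the specialty mix of respondents and non-respondents with a chi-square test. The counts below are invented and the authors do not say which test they ran; this is only a sketch of the idea.

    from scipy.stats import chi2_contingency

    #                  primary care, surgical, other specialties (invented counts)
    respondents     = [410, 250, 340]
    non_respondents = [560, 330, 430]

    chi2, p, dof, expected = chi2_contingency([respondents, non_respondents])
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
    # A large p-value is consistent with finding no meaningful difference
    # between those who responded and those who did not.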

In contrast, the IBD article reports nothing at all about their response rate or the characteristics of the physicians it surveyed.

So while we may not have a solution to the mystery of why these surveys produced such divergent results, we can see a huge contrast between the transparency of the Mount Sinai study and the opacity of the poll published by IBD. Yes, the New England Journal of Medicine sets a much higher bar for disclosure than a newspaper, but the IBD poll fails to measure up even to accepted standards for media polling.

I sent a request last week to both IBD and their pollster asking for the exact wording of published questions, a description of the population sampled, the sampling method, the dates of interviewing and the response rate. Both the National Council on Public Polls and the American Association for Public Opinion Research mandate disclosure of the first four in all public reports. AAPOR also requires public disclosure of response rates, while NCPP says pollsters should provide them on request. So far, the IBD/TIPP poll fails on all counts.

So should we trust their poll? I say no.


