[lbo-talk] race, polls, and lying

Doug Henwood dhenwood at panix.com
Sat Aug 2 06:17:20 PDT 2008


Wall Street Journal - August 2, 2008

When Voters Lie

It's a given that people fib in surveys, and this election season is especially tricky with race looming as an issue. How pollsters are trying to uncover the truth.

By ELLEN GAMERMAN

Please respond to the following statement:

People lie on polls:

Strongly agree
Agree somewhat
Neither agree nor disagree
Disagree somewhat
Strongly disagree

One of the toughest questions on any poll is whether people are telling the truth. It is a conundrum that looms front and center as voters look ahead to the first U.S. presidential contest that an African-American candidate has a chance to win. With polls showing overwhelming voter support for the idea of a black president, researchers and pollsters are trying to determine who really means it.

Peter Hart, a Democrat on a bipartisan team conducting the Wall Street Journal/NBC News poll, estimates that 10% of current Democrats and independents who say they support presumed Democratic Party nominee Barack Obama may not be giving a fully honest answer, at least based on their responses to broader questions about race. "This election is exceptionally tricky," he says.

While most political pollsters say they don't find large numbers of people lying on polls, they are taking extra precautions. At CBS, pollster Kathleen A. Frankovic says she will ask voters whether they think most people they know would vote for a black candidate -- an indirect way to fish for racial bias. John Zogby, president of the polling firm Zogby International, is asking white respondents whether they have ever been to a dinner party where a black person was present. It only takes a handful of people hiding their true opinion to skew poll results, he says: "A small number can loom large."
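
To see why "a small number can loom large," consider a rough back-of-the-envelope sketch; the figures below are illustrative assumptions, not numbers from the article:

    # Illustrative only: how a small share of respondents who tell a pollster
    # "candidate A" while intending to vote for candidate B can flip a
    # reported lead. All numbers here are assumed for the example.
    measured_a, measured_b = 0.47, 0.41   # shares the poll reports
    hidden_flip = 0.05                    # assumed share misreporting B as A
    true_a = measured_a - hidden_flip
    true_b = measured_b + hidden_flip
    print(f"reported margin: {measured_a - measured_b:+.0%}")  # A ahead by 6 points
    print(f"implied margin:  {true_a - true_b:+.0%}")          # B ahead by 4 points

In this hypothetical, a 5-point rate of misreporting turns a 6-point reported lead into a 4-point deficit.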

At ABC News, polling director Gary Langer says the network is noting the race of the phone interviewer for the first time in its presidential polls. The idea is to see whether the questioner's race could have an effect on responses by voters. (Though respondents don't know the race of the interviewer, they might try to guess based on the interviewer's voice.) All the responses are combined to create a big picture that can show if, for example, white voters tend to tell white interviewers one thing and African-American interviewers another. So far, Mr. Langer says, the race of the interviewer doesn't appear to have an effect.
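
The article doesn't describe the bookkeeping behind this, but the basic idea amounts to a cross-tabulation of stated support by respondent and interviewer race. A minimal sketch in Python, with hypothetical field names and made-up records:

    # Minimal sketch of the cross-tabulation described above: pool interviews,
    # then compare stated support across interviewer race within each group of
    # respondents. Records and field names are hypothetical.
    from collections import defaultdict

    interviews = [
        {"respondent_race": "white", "interviewer_race": "white", "supports_obama": True},
        {"respondent_race": "white", "interviewer_race": "black", "supports_obama": False},
        {"respondent_race": "black", "interviewer_race": "white", "supports_obama": True},
        # ... thousands of interviews would be pooled in practice
    ]

    tally = defaultdict(lambda: [0, 0])   # (respondent, interviewer) -> [yes, total]
    for row in interviews:
        key = (row["respondent_race"], row["interviewer_race"])
        tally[key][0] += row["supports_obama"]
        tally[key][1] += 1

    for (resp, intr), (yes, total) in sorted(tally.items()):
        print(f"{resp} respondents, {intr} interviewer: {yes / total:.0%} support (n={total})")

A persistent gap between the interviewer-race rows within the same respondent group would be the kind of effect Mr. Langer is checking for.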

Democratic pollster Mark Mellman is going one step further. He's having his interviewers ask at the end of the survey, "What race do you think I am?" He says assumptions about an interviewer's race can have an impact, and he wants to develop corrections for that in some of his surveys.

Pollsters look for the "Bradley Effect," the idea that some white voters are reluctant to say they support a white candidate over a black candidate. The phrase refers to California's 1982 gubernatorial election, when the late Tom Bradley, a black Democratic mayor of Los Angeles, led in exit polls against white Republican George Deukmejian. Mr. Bradley lost the election. The conclusion: some voters hid their true choice from pollsters. Skeptics say the issue was neither race nor honesty. One theory is that Mr. Deukmejian's supporters simply didn't want to participate in polls.

Sen. Obama leads Republican rival John McCain 47% to 41%, according to a Wall Street Journal/NBC News survey last month. Aides to Sen. Obama and Sen. McCain declined to discuss the details of the candidates' polling strategies. But the two camps no doubt take surveys with a grain of salt. "There certainly is a presumption that people self-censor to some degree," says Scott Keeter, director of survey research at the Pew Research Center in Washington.

In a recently released study, Sacred Heart University in Fairfield, Conn., found that nearly 11% of people who reported having been polled said they had lied to pollsters about their views on politics and public affairs. "Why they're lying is probably as varied as individuals are varied," says Jerry Lindsley, director of the school's polling institute. "Halfway through a survey, they might all of a sudden get nervous about the kinds of questions they're being asked and start to lie or not be totally straightforward."

Questions about polling and race were raised during this year's presidential primaries. In New Hampshire, polls gave Sen. Obama as much as a 10-percentage-point advantage over Hillary Clinton the day before the primary. Sen. Clinton went on to win the state. Pollster Andrew Kohut, president of the Pew Research Center, doesn't blame lying. Instead, he says, some voters who were poorer, less-educated and white may have had less favorable views of African-Americans and were less likely to take surveys. "When polls get it wrong, it's not because people lied, it's because the people who turned down the polls have different attitudes than the people who took the polls," he says.

It may turn out that hidden prejudices don't significantly affect the outcome of this election. Even so, researchers working on the National Election Studies are drafting extra questions to spot racial bias. The federally funded election studies have been conducted every four years since pollsters got the Dewey-Truman presidential race wrong in 1948. For the November election, researchers are preparing a survey with about 50% more questions on race than in 2004, says Arthur Lupia, a University of Michigan professor and one of the studies' chief investigators. Many of the questions are aimed at gauging the so-called social desirability bias -- a widely studied phenomenon in which people change their answers because they are too embarrassed to say how they really feel.

"For us not to go for it and ask, 'Look, is it race or not?' there'd be massive disappointment in our study," says Dr. Lupia. He plans to include a question asking whether a white or black candidate would be better suited to deal with foreign-policy problems. The answers, matched up with information on respondents' race, whether they voted and whom they reported voting for, can help analysts determine the extent to which people's feelings about race affected their votes in the presidential election -- even if they say race wasn't a factor.

His colleagues at the University of Michigan Survey Research Center also are studying survey-takers. In a cluttered fourth-floor office, Fred Conrad, an associate professor of survey methodology, is trying to use online polling to elicit honest responses. He and researchers at the New School for Social Research in New York recently completed a study using a redheaded computer avatar named Victoria. She comes in two varieties: one who moves, blinks, smiles and even smirks, and a low-tech version who mostly stares blankly. The responses from people who take surveys with the two Victorias are compared with responses given to human interviewers, as well as audio surveys given online.

Dr. Conrad says that in many cases the more human Victoria looked, the more people lied. When she asked about their TV-watching habits, her pouty lips moving in sync with her questions, more people told her they only watched prime-time TV several times a month compared with online survey-takers, who were more likely to report watching several times a week. They also told Victoria they were thinner than they told the human interviewers, who presumably could see who was lying.

Dr. Conrad, a boyish 52-year-old, says the results may have been more dramatic because of Victoria's looks. "She's a little sexy," he says. Victoria's outfit was more revealing than a typical interviewer's, with her black top scooping down to reveal her collarbones. Dr. Conrad says he had little choice. The software he used offered only one other female avatar, Kim, whose eyes "looked like they might fire yellow laser beams at you and destroy you," he says. No matter which avatar he picked, he says, his or her gestures and looks would affect the outcome of the survey. That's the point. "To us, the lesson is, everything matters," he says.

A few doors down, Roger Tourangeau looks at ways people change their responses without always realizing it. The researcher in survey methodology was part of a University of Michigan and University of Maryland team studying what he calls "Good Is Up." If a word is listed at the top of a computer screen, more people are likely to assume it is positive, especially if they don't know its definition. In one test, he put "riboflavin" on a list of nutrients. When it was at the top of the screen, more people said it was good for them. When it was lower down, it was identified as less healthful.

The research is helping refine polling at a university phone center nearby. Activity at the center, which sits in a former school building, picks up around dinnertime when the staff makes calls for university-run surveys from a warren of cubicles. The questioners are asked to speak in even tones, reading from scripts. No one is allowed to say, "How are you?" in case the person on the other end had a bad day. The interviewers don't laugh; they don't want people to treat this as a social call. They are allowed only neutral responses such as "I see" or "Hmm."

Many polls offer anonymity, but now there is evidence that also distorts results. Stanford University professor Jon Krosnick and researchers from the University of Colorado studied people who were offered M&M's and then asked to fill out surveys. Some respondents were asked to include their names; others remained anonymous. When survey-takers were asked how many candies they had eaten, almost everyone said they had eaten fewer than they actually did. But the anonymous people said they had taken the fewest M&M's. "This is the drawback," Dr. Krosnick says. "They're not held accountable for being accurate."

People fib about such touchy subjects as how much they weigh, how often they have sex, how much money they save, how many vegetables they eat and how often they go to church. Finding out why has spawned reams of academic papers. People may give inaccurate answers and not realize it, one reason why many experts are reluctant to call this "lying." But researchers also know that people hide the truth to avoid embarrassment, or they shape their answers to please interviewers. Other survey participants may not know what to say, so they make up an answer.

In a study last year, the rate of abortions performed by health-care providers was found to be twice that reported by women in face-to-face interviews. The Guttmacher Institute, a nonprofit organization that researches reproductive health issues, looked at the responses women gave in the 2002 National Survey of Family Growth and compared that with data reported by abortion providers.

Liz Ward, a 45-year-old public-relations consultant in Manhattan, recently found herself fibbing about her views on abortion. Near the end of a phone survey about a state reproductive-rights bill, she indicated that she supported late-term abortions and allowing teenage girls to receive abortions without their parents' knowledge. In fact, she objects to both. She says she answered yes because she supports abortion rights, generally. Although she didn't believe the bill would promote either scenario, she still felt uncomfortable after she hung up. "I guess I lied," she says.

Julio Vasconcellos, 27, a vice president at the social-networking site Experience Project, says he reported on a survey for his San Francisco gym that he jogs three times a week, even though he runs just twice a week. He says he didn't think about the lie as it was happening. "I sort of put down my goal," he says. "You definitely want to make things look a little better."

---

HOW THE UNCONSCIOUS AFFECTS THE TRUTH

Pollsters try to get voters to reveal the biases they're too embarrassed or afraid to admit by asking questions like, "Is the country ready to elect an African-American president?" But people also have biases they don't know they have. These implicit biases, as psychologists call them, are picked up over a lifetime, absorbed from our culture, and work automatically to color our perceptions and influence our choices.

A massive study called Project Implicit uses a simple online test to attempt to measure the pervasiveness of dozens of implicit social biases, including those based on race, gender, sexual orientation, ethnicity, weight, age and religion. The project, housed jointly at the University of Virginia, Harvard University and the University of Washington, collects 20,000 responses a week -- and hundreds of researchers are using its data to predict how people will behave based on their unconscious prejudices.

The findings from Project Implicit's six million participants over a decade of testing reveal lingering suspicion of minority groups: Some 75% of whites, Hispanics and Asians show a bias for whites over African-Americans. Two-thirds of all respondents feel better toward heterosexuals than gays, toward Jews than Muslims and toward thin people than the obese. Minorities appear to carry some of the same biases. As many African-Americans show a preference for whites as for blacks. A third of Arab Muslims show a bias in favor of non-Muslims, and more than a third of gays prefer straight people. The strongest biases are against the elderly. More than 80% of test-takers showed a bias for the young, and that included respondents older than 60.

Project Implicit -- which is funded in part by the National Institute of Mental Health and the National Science Foundation -- studies how people associate a group of people, shown in photographs, with either positive or negative words. (Demonstrations and registration for the full tests are available online at implicit.harvard.edu.)
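
The article doesn't spell out the scoring, but tests of this kind typically time how quickly people sort the photographs when a group is paired with positive words versus negative words; a slower pairing suggests a weaker association. A deliberately simplified sketch of that idea, not Project Implicit's actual scoring procedure, with invented timings:

    # Simplified illustration of latency-based association scoring.
    # Not Project Implicit's actual algorithm; the reaction times are invented.
    def mean(xs):
        return sum(xs) / len(xs)

    # Reaction times (ms) from two sorting blocks for one hypothetical test-taker:
    # "compatible" pairs group-A photos with positive words,
    # "incompatible" pairs group-A photos with negative words.
    compatible_ms = [612, 588, 640, 601, 577]
    incompatible_ms = [742, 755, 698, 731, 769]

    latency_gap = mean(incompatible_ms) - mean(compatible_ms)
    print(f"mean latency gap: {latency_gap:.0f} ms")  # a larger gap suggests a stronger implicit association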

Most of the research around its data is academic. One current project aims to see whether liberals or conservatives are more enthusiastic about the future or more nostalgic about the past. Practical applications are starting to evolve too. Clinical psychologists are studying whether implicit biases affect how doctors care for their patients.

Bias against African-Americans and the elderly will likely play a role in November's presidential election. Presumed Democratic Party nominee Barack Obama's father was black, and presumed Republican Party nominee John McCain is about to turn 72 years old. But researchers say they do not know to what degree bias will play out among voters. "We are not slaves to our associations," says University of Virginia psychologist Brian Nosek, one of Project Implicit's founders and principal researchers. A focus on Sen. Obama's promise of change, for example, could lead voters to forget his race; Sen. McCain's war record could let voters forget his age. Overriding bias requires a concerted effort, Mr. Nosek says. Most people don't see their own implicit bias, which can appear spontaneously as intuition, a gut feeling or a vague doubt about a candidate.

In the anonymity of the voting booth, those feelings could have a significant effect on undecided voters, says Yale University psychologist John Dovidio. With as many as 15% of voters undecided and up for grabs, implicit bias could have a big effect in the presidential election.

University of Washington psychologist Anthony Greenwald, who developed Project Implicit's online test, predicts that Sen. McCain will get more votes than the polls currently predict in states with small black populations and that Sen. Obama will get more votes than polls predict in states with large African-American populations. The reason: whites unconsciously understate their pro-white bias by telling pollsters they will vote for Sen. Obama, while blacks unconsciously understate their pro-black preference by saying they don't intend to vote.

"There may be more in us that anticipates what we are going to do than we can report to others or ourselves," says Mr. Nosek. We aren't lying, he says, but we also may not be telling the truth.

--June Kronholz


