The story is familiar. A political partisan reacts to a survey result he or she does not like by trashing it, alleging bias or incompetence by the pollster. In doing so, the partisan, either by ignorance or design, thoroughly misrepresents the pollster's methodology. That sort of partisan attack happens somewhere in the blogosphere almost every day during campaign season. What is unusual, however, is when one pollster trashes another this way. Strangely enough, that's exactly what happened last week when a pollster named Matt Towery let loose against a survey conducted by CBS News.
On March 20, CBS News pollsters conducted a survey about Barack Obama's speech on race and the Rev. Jeremiah Wright, his controversial pastor. They reported that 70 percent of voters who had heard or read about Obama's speech said it would make no difference to their vote, while 14 percent said it would make them less likely to support Obama and 14 percent said it would make them more likely to support him.
Towery's organization, InsiderAdvantage, also conducted a survey, on March 19, to gauge reaction to the Obama speech. It found 52 percent of those who were aware of the speech saying it would make them less likely to support Obama, 19 percent saying it would make them more likely to support him and 27 percent choosing the category "about the same."
The surveys differed in several important ways -- including the wording of the questions, as well as the fact that CBS used live interviewers while InsiderAdvantage used an automated methodology -- but in his syndicated column, Towery seized on the difference in sampling. CBS had recontacted 542 registered voters it had interviewed the previous week, while InsiderAdvantage called a fresh sample of 1,051 adults, all in one night.
Towery called the CBS approach "the most curious polling methodology I've ever come across" and slammed it as "the single most biased and dishonest public opinion survey I've ever seen."
His memory may be short. Pollsters have been recontacting respondents for follow-up interviews since the 1920s, when market researchers found it easier to create "panels" of expert respondents (the term derived from the concept of a "jury panel") that could provide feedback on consumer preferences. In the 1930s, the renowned sociologist Paul Lazarsfeld pioneered the use of the "panel study" to measure changing opinions, and survey researchers have been using it for that purpose ever since.
Among other applications, representative panel surveys have been used as part of the American National Election Studies (ANES) conducted by the Institute for Social Research at the University of Michigan starting in the 1940s. If you have taken an introductory political science course on American elections over the last 40 years, odds are good you have been exposed to data collected from the ANES panel studies.
Towery claims that CBS did not "randomly phone registered voters," but rather "re-polled" a "group of people" they had interviewed previously "about Wright and his views."
That is not quite right. CBS originally interviewed a random sample of 1,067 adults nationwide over four nights prior to the Obama speech (March 15-18). As per their usual practice, they weighted the adult sample on demographic characteristics such as gender, age and race to match U.S. Census estimates. The survey included at least 40 substantive items, including two questions about Obama and Wright on the second night of calling. They asked if respondents had "heard or read about the controversy over statements made by Dr. Jeremiah Wright, who has been Barack Obama's minister," and if "Wright's statements affected [their] opinion of Obama," making them "feel more favorable or less favorable" toward him.
Following the speech, CBS redialed their respondents and were able to interview 542 registered voters for a second time, asking about Obama's speech and reactions to it. The pollsters weighted the data from the second wave so that the respondent demographics matched those from the first. They also weighted on Obama's favorable rating from the first survey to correct any statistical bias that might result if Obama fans or detractors were more or less likely to respond the second time around.
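The reweighting step CBS describes -- adjusting the second wave so its demographics match the first -- is a standard post-stratification move. The sketch below is purely illustrative (the function name, the gender margin, and the 60/40 split are my own assumptions, not CBS figures); it shows the core arithmetic: each respondent's weight is the target share of his or her group divided by that group's observed share.

```python
# Hypothetical sketch of post-stratification weighting: adjust
# wave-two respondents so one demographic margin (here, gender)
# matches the wave-one distribution. All names and numbers are
# illustrative, not CBS data.

from collections import Counter

def poststratify(sample, target_shares):
    """Return one weight per respondent so the sample's category
    shares match target_shares (a dict of category -> share)."""
    counts = Counter(sample)
    n = len(sample)
    # weight factor = (target share) / (observed share) per category
    factors = {cat: target_shares[cat] / (counts[cat] / n)
               for cat in counts}
    return [factors[cat] for cat in sample]

# Suppose wave one was 52% female / 48% male, but wave two came
# back 60/40: women get weighted down, men up.
wave_two = ["F"] * 60 + ["M"] * 40
weights = poststratify(wave_two, {"F": 0.52, "M": 0.48})
```

After weighting, the female share of the (weighted) wave-two sample is back at 52 percent. CBS's extra step of also weighting on Obama's first-wave favorable rating works the same way, just on an attitudinal margin instead of a demographic one.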
This procedure gave CBS News a particularly strong ability to measure not just overall change in opinion, but change among individuals. CBS director of surveys Kathy Frankovic devoted her online column to the subject, reviewing data showing much shifting among individuals in their perceptions of Obama. Some grew more favorable, some less favorable, producing "no sizeable overall change."
Towery is right to ask whether being interviewed the first time about Obama, Wright and a host of other issues made those 542 respondents more attentive to politics in the week of Obama's speech. Republican pollster David Hill raised similar issues in a critique of panel surveys four years ago.
In this case, however, the CBS release was careful to emphasize results among those who had seen or heard about the speech, and CBS made no attempt to use this survey to estimate overall vote preferences for registered voters.
The best test of Towery's critique may be how the results from the two instant reaction surveys compared to other "fresh" samples of voters conducted since the speech. None of the national polls -- including the automated Rasmussen Reports surveys -- have shown a statistically significant decline in Obama's support since the Wright speech.
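A "statistically significant decline" here means a drop larger than the margin of error on the difference between two independent polls. A back-of-the-envelope version of that check looks like the sketch below; the percentages and sample sizes are purely illustrative, not figures from CBS, Rasmussen or any other poll.

```python
# Illustrative significance check for a drop in a candidate's
# support between two independent polls, using the normal
# approximation for the difference of two proportions.

import math

def decline_is_significant(p1, n1, p2, n2, z=1.96):
    """True if the drop from p1 (sample size n1) to p2 (sample
    size n2) exceeds the 95% margin of error for the difference
    of two independent proportions."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p1 - p2) > z * se

# A 2-point slide between two ~1,000-person polls is well inside
# sampling noise:
decline_is_significant(0.48, 1000, 0.46, 1000)  # -> False
```

With samples of about 1,000, the margin of error on the difference is roughly 4 points, which is why none of the small post-speech movements cleared the bar.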
"Unbelievable," Towery wrote about the CBS survey. "Had my firm employed these types of polling tactics, pundits and alleged 'polling experts' would have torn us to pieces."
Unbelievable, indeed. Just last November, Towery himself conducted a recontact study to gauge reaction to a debate. And a certain polling "expert" called it "innovative." Go figure.
-- Mark Blumenthal is editor and publisher of Pollster.com. His e-mail address is email@example.com.