Philip J. Trounstine

The Science of Political Polling


Phil Trounstine is Director of the Survey and Policy Research Institute (SPRI),
a self-supporting, non-profit entity located at San Jose State University that
provides consulting services in the areas of survey and policy research,
analysis and report generation to business, government, civic and political
organizations. The Institute provides expertise in sampling techniques, instrument
design, quantitative and qualitative research, professional interviewing, data
analysis, report presentation and multicultural research.

The use of polling to gauge the impact of statements from political candidates
on the electorate has been widespread during this year's presidential race, and
accusations of deliberately biased polling have been made by both campaigns.
Phil, the former political editor of the San Jose Mercury News, will give an
overview of the science behind polling, give his sense of the level of bias in
the polls, and comment on how polling results may influence how voters make
their selections on election day.


Phil began his talk by explaining that SPRI first made its name by gathering information on California consumer confidence, using the same polling techniques that the University of Michigan developed to gather national data on the same subject. Since then the Institute has done a great deal of polling for a variety of clients. Part of his goal in basing the institute at San Jose State was to help give the school more of the kind of name it deserves, especially considering that a lot of the worker bees who keep Silicon Valley moving forward come out of that school.

He then cited a number of different national polls that showed Bush ahead of Kerry by one to two percentage points in the race to be the next President. Then he explained that there are many tricky issues associated with getting good information from people. For example, if you ask someone which party they identify with and then whom they will vote for, the answer is likely to be very predictable. If you put a number of questions between those two, you are more likely to get answers that actually capture that person's feelings.
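
For a sense of scale, a one or two point lead is smaller than the sampling error of a typical national poll. The short Python sketch below computes the standard 95% margin of error for a sample proportion; the sample size of 1,000 is an assumption for illustration, not a figure from the talk.

    import math

    def margin_of_error(p, n, z=1.96):
        """Half-width of a 95% confidence interval for a sample proportion."""
        return z * math.sqrt(p * (1 - p) / n)

    # Assuming a roughly even split and a typical national sample of about
    # 1,000 respondents, sampling error alone is about +/-3 points, so a
    # one-to-two point lead falls inside the margin of error.
    moe = margin_of_error(p=0.5, n=1000)
    print(f"95% margin of error: +/-{moe * 100:.1f} points")  # roughly +/-3.1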

Phil then explained that pollsters generally use randomized lists of residential landline phone numbers because studies have shown these are most likely to give answers that are statistically similar to those of election day, within margins of error. There are companies that provide such lists in blocks of 1000, evenly distributed throughout the area codes and prefixes of the state.
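
The sketch below shows, in Python, roughly what drawing such a list might look like. The area code and prefix combinations are invented placeholders, and real sample vendors use much more elaborate methods than simply picking random four-digit suffixes.

    import random

    def random_phone_sample(blocks, block_size=1000):
        """Draw `block_size` random phone numbers, spread evenly across the
        given (area code, prefix) blocks, by picking random 4-digit suffixes."""
        sample = []
        for i in range(block_size):
            area_code, prefix = blocks[i % len(blocks)]
            suffix = random.randint(0, 9999)
            sample.append(f"({area_code}) {prefix}-{suffix:04d}")
        return sample

    # Hypothetical California area code / prefix pairs, for illustration only.
    blocks = [("408", "924"), ("415", "555"), ("916", "322"), ("619", "236")]
    numbers = random_phone_sample(blocks)
    print(numbers[:3])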

Phil said there are some population shifts that are not captured by the sampling techniques above. For example, young voters are more likely to have only cell phones. To illustrate this he asked if anybody in the room had a cell phone and no landline. One person raised his hand (something less than 5% of the people in the room). Then he told us he had asked the same question in a classroom at San Jose State, and a quarter of the students had raised their hands. He said it was possible that such effects could make poll results different from the outcome of the election in states like Ohio and Florida, which have huge populations of college students (who have been shown to be more likely to vote for Kerry).
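
The kind of coverage bias Phil described can be put in rough numbers. All of the figures in this sketch are invented for illustration; they are not from the talk.

    # Hypothetical figures, for illustration only (not from the talk).
    cell_only_share = 0.10   # fraction of voters a landline-only poll cannot reach
    kerry_landline = 0.48    # Kerry support among landline households
    kerry_cell_only = 0.60   # Kerry support among cell-only (mostly younger) voters

    # What a landline-only poll sees versus the full electorate.
    poll_estimate = kerry_landline
    true_support = ((1 - cell_only_share) * kerry_landline
                    + cell_only_share * kerry_cell_only)

    print(f"Landline poll estimate: {poll_estimate:.1%}")                  # 48.0%
    print(f"Full electorate:        {true_support:.1%}")                   # 49.2%
    print(f"Coverage bias:          {true_support - poll_estimate:+.1%}")  # +1.2%

Whether a gap of that size matters depends on how close the race is in a particular state, which is exactly why Phil singled out Ohio and Florida.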

Tian Harter