There are two main types of state pollster: public pollsters, who are usually sponsored by news channels, and university pollsters, who have one key advantage in the form of cheap labour: students.
Most multi-state public pollsters try to cut costs by using an automated calling service (i.e. without a live interviewer), known as Interactive Voice Response (IVR). While some pollsters object to this, in 2004 IVR polls performed better than the more traditional methods, as shown here, and fivethirtyeight has two of the largest IVR pollsters, SUSA and Rasmussen Reports, very near the top of its pollster ratings. Nate Silver at 538 ranks pollsters in order of their performance in past elections compared to their competitors. (Actually it's a little bit more complicated than that; here is the full description, with the warning that it becomes quite technical.)
For a more detailed analysis, this article lays out the benefits and possible problems of IVR. There are also campaign pollsters who occasionally release their results, usually when they have just completed an especially good poll for their candidate, or when a bad set of public polls comes out against their candidate.
Here is a quick rundown of the main public and university pollsters:
Rasmussen Reports: Rasmussen does the most state polling of all the pollsters and, despite that volume, is ranked an impressive 3rd on 538. As with his national tracker, Rasmussen does weight his state polls by party-id. He does this by taking the party weighting from the 2004 and 2006 state exit polls and adjusting it by the difference between the 2004 and 2006 national exit polls and the current numbers from his tracker. He also keeps track of his previous polls from the same state and may do some fine-tuning if the numbers do not match.
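To give a rough idea of what that kind of adjustment might look like, here is a minimal sketch with made-up numbers (these are not Rasmussen's actual figures, and his exact formula may well differ):

```python
# Hypothetical party-id adjustment: shift the old state exit-poll weights by
# the national change between the 2004 exit poll and the current tracker.
# All numbers below are invented for illustration.

state_exit_2004 = {"dem": 0.35, "rep": 0.40, "ind": 0.25}     # state exit poll party-id
national_exit_2004 = {"dem": 0.37, "rep": 0.37, "ind": 0.26}  # national exit poll party-id
tracker_now = {"dem": 0.39, "rep": 0.33, "ind": 0.28}         # current national tracker

# Apply the national shift to the state figures, then renormalise to sum to 1
shifted = {p: state_exit_2004[p] + (tracker_now[p] - national_exit_2004[p])
           for p in state_exit_2004}
total = sum(shifted.values())
party_targets = {p: round(v / total, 3) for p, v in shifted.items()}

print(party_targets)  # the party-id mix the state sample would be weighted towards
```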
His sample sizes can be quite small at 500LV (although his weekly five battleground polls, which are sponsored by Fox, are a more respectable 1000LVs). This, combined with the large number of polls he commissions, is bound to lead to some outliers: for example, last month his New Mexico poll had McCain up by 2, and in July his Ohio poll had McCain up by ten. But these are relatively rare.
Compared to the average of all pollsters, Rasmussen tends to lean a couple of points towards McCain. That's not to say Rasmussen is necessarily biased or wrong (indeed, if 2004 is anything to go by, he is more likely to be right); it is just that, relative to the average of all pollsters, his polls tend to lean slightly to McCain. This is not a uniform lean across all the states: while Rasmussen's polls are near the average in Florida and North Carolina, they tend to lean McCain in Virginia and Ohio. Despite this, Rasmussen polls are certainly important polls to keep an eye on.
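To make the idea of a "lean relative to the average" concrete, here is a toy calculation (the margins below are invented, not real poll numbers):

```python
# Toy illustration of a house lean: subtract the all-pollster average margin in
# each state from one pollster's margin. Margins are Obama minus McCain, in
# points, and are invented for illustration only.
rasmussen_margin = {"VA": 1, "OH": 0, "FL": 3, "NC": 0}
average_margin   = {"VA": 5, "OH": 3, "FL": 3, "NC": 0}

lean = {state: rasmussen_margin[state] - average_margin[state]
        for state in rasmussen_margin}
print(lean)  # negative numbers mean leaning towards McCain relative to the average
```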
SurveyUSA (SUSA): Does the second most state polls but, unlike Rasmussen, does not weight for party-id. After weighting for the usual demographic factors (gender, age, race, geographic location), SUSA just lets the party-id fall where the sample happens to come up. Much like in the national polls, this leads to a somewhat more variable poll but prevents one source of bias entering. SUSA was the most accurate pollster during the primary season, often (rather eerily) predicting the exact result. However, the same caveat that applied to Rasmussen applies to SUSA: any pollster which does a large number of polls with a relatively small sample size will have some outliers, such as this Minnesota poll. SUSA deserves much credit for releasing arguably the most detailed internals of all the pollsters.
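For those curious what "weighting for demographics but letting party-id fall where it may" means in practice, here is a very simplified sketch. The respondents and target shares are invented, and real weighting schemes (such as raking) are more involved than this:

```python
# Weight respondents to match demographic targets only; party-id is never
# adjusted, so the weighted party split is whatever the sample produces.
respondents = [
    {"gender": "f", "age": "18-34", "party": "dem"},
    {"gender": "m", "age": "65+",   "party": "rep"},
    {"gender": "f", "age": "35-64", "party": "ind"},
    # ... in a real poll, several hundred more respondents ...
]

# Target shares for each demographic (e.g. from the census); invented here
targets = {"gender": {"f": 0.52, "m": 0.48},
           "age": {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}}

def sample_share(key, value):
    return sum(1 for r in respondents if r[key] == value) / len(respondents)

# Each respondent's weight is the product of target/sample ratios for their cells
for r in respondents:
    r["weight"] = 1.0
    for key in ("gender", "age"):
        r["weight"] *= targets[key][r[key]] / sample_share(key, r[key])

# The weighted party-id "falls where it may"
dem_share = (sum(r["weight"] for r in respondents if r["party"] == "dem")
             / sum(r["weight"] for r in respondents))
print(dem_share)
```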
Insider Advantage (I/A): This pollster is run by Matt Towery, a former Republican congressman from Georgia. I/A polls tend to have small samples and do weight for party-id. While there does not seem to be a systematic bias towards either candidate, their polls' internals can be very erratic. Throughout the primary season they substantially underestimated Obama's support among African-American voters (to some extent all pollsters did this, but I/A were the worst), and they have continued to do so in the general election.
(Note: As of late I/A have stopped releasing their internals)
Strategic Vision (SV): This is a pollster associated with the Republican Party. It does not release details of its internals or, for that matter, its methodology. However, it does use relatively large sample sizes, usually 1000+. Its polls tend to lean to the Republicans by a couple of points relative to the average pollster.
Public Policy Polling (PPP): This pollster is associated with the Democratic Party but, unlike SV, does release all of its internals and methodology. PPP subscribes to a system known as Aristotle, which provides a list of registered voters with key demographics (gender, age, race, location, party-id). This lets them create a proportionate voter pool matching what they predict the electorate will be on election night, and then randomly pick from this pool for their poll sample. It has a relatively large sample, usually 1000+ LVs. Intriguingly, PPP runs a blog which allows Tom Jensen (the communications director of PPP) to explain his methods and interesting findings. The blog also allows those interested to post questions and comments and, very recently, once a week, to actually vote on where PPP should poll next. PPP polls do not weight for party-id and tend to lean to the Democrats by a couple of points from the average of pollsters.
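As a rough illustration of the voter-file approach described above (the field names, targets and sampling scheme below are my own invention, not Aristotle's or PPP's actual system):

```python
# Build a pool that matches the predicted election-night electorate, then
# sample respondents from it at random. Everything here is illustrative.
import random

voter_file = [
    {"phone": "555-0101", "gender": "f", "age": 22, "race": "white", "party": "dem"},
    {"phone": "555-0102", "gender": "m", "age": 67, "race": "black", "party": "dem"},
    {"phone": "555-0103", "gender": "f", "age": 45, "race": "white", "party": "rep"},
    # ... in practice, the full list of registered voters for the state ...
]

# Predicted election-night electorate by party-id (made-up targets)
predicted_electorate = {"dem": 0.38, "rep": 0.34, "ind": 0.28}

def draw_sample(n):
    """Draw up to n respondents so the party mix matches the predicted electorate."""
    sample = []
    for party, share in predicted_electorate.items():
        group = [v for v in voter_file if v["party"] == party]
        k = min(len(group), round(n * share))
        sample.extend(random.sample(group, k))
    return sample

respondents_to_call = draw_sample(600)
print(len(respondents_to_call))
```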
Quinnipiac: A generally well-respected university pollster. It has large sample sizes of over 1000 LVs and does not weight for age. It tends to lean to the Democrats by two or three points.
Research 2000 (R2000): An independent pollster which, like SUSA, releases almost complete internals for its polls. It tends to be sponsored by relatively small companies (small-ish TV stations or, quite often, Daily Kos) and hence usually has relatively small samples (around 600). It does weight its polls by party-id and tends to lean to the Democrats by a couple of points.
Mason-Dixon: A well-regarded independent pollster which, much like R2000, tends to be sponsored by small companies. It has a small sample size and weights its polls by party-id. It tends to lean to the Republicans by a couple of points.
Selzer: The oracle of Iowa. Selzer has the highest ranking on 538. She only polls Iowa and some of the nearby states (Michigan and Indiana). Selzer has an interesting method for dealing with the expected higher youth turnout in 2008: she records the response rate and likelihood of adults of various ages actually voting, and then multiplies the propensity to vote of these age groups by the demographics of the state (rather than just looking at the results of previous elections). Due to this method she is predicting a younger electorate, and hence tends to lean to the Democrats by a couple of points.
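A back-of-the-envelope version of that calculation, with invented numbers, might look like this:

```python
# Multiply each age group's measured propensity to vote by that group's share
# of the state's adult population, rather than re-using the age mix of past
# electorates. All figures are made up for illustration.

adult_population_share = {"18-29": 0.21, "30-44": 0.26, "45-64": 0.34, "65+": 0.19}
propensity_to_vote     = {"18-29": 0.55, "30-44": 0.65, "45-64": 0.75, "65+": 0.80}

expected_voters = {age: adult_population_share[age] * propensity_to_vote[age]
                   for age in adult_population_share}
total = sum(expected_voters.values())
electorate_share = {age: round(v / total, 3) for age, v in expected_voters.items()}

print(electorate_share)  # a younger electorate than 2004 if youth propensity is up
```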
CNN/TIME/ORC: Unfortunately they release very little about their polls. They have moderate sample sizes (about 700 LVs) and tend to lean a couple of points to the Democrats. (The polling company ORC is also commissioned by Fox to do their national poll.)
Marist: Another respected university poll. It has moderate sample sizes (about 800 LVs). Marist does weight by party-id. Strangely, while Marist's demographic weights would imply a Republican lean, the results lean to the Democrats by several points.
Big Ten: Universities from ten battleground states came together to provide what was meant to be a good look at the upcoming election. However, the results have been mixed, with some odd findings such as an Iowa poll showing the race tied while all other polls have had Obama with double-figure leads. There were other oddities among the internals of these polls. Hopefully they will be able to work past these teething problems.
American Research Group (ARG): An independent pollster which had a torrid primary season, often missing the actual results by more than ten or even twenty points. Even the normally innocuous Mark Blumenthal felt the need to repudiate ARG. Moreover, their editorial comments bring into question whether they are competent (or perhaps honest) enough to be in the industry, given such absurd assertions as their explanation that their polls leaned to Clinton during the primaries by 4-5 points because undecideds went uniformly to Obama, despite the exit polls showing that, if anything, undecideds tended towards Clinton. Do not trust these polls.
Zogby Interactive: The internet poll which gives all other internet polls a bad name. Whether it is Zogby's weighting or some problem with the methodology, these polls are more erratic than a drunk darts player. Sometimes they hit the board, but they are extremely dangerous to those who analyse the state of the race by taking averages of all available polls. Some of the more absurd results were Obama leading by 3 points in Arizona and Barr taking 15% in New Hampshire. They have one of the lowest ratings on 538.
Friday, October 17, 2008
3 comments:
...what about Gallup?
Finally! Thank you John for a lucid explanation of the value of each pollster. Maybe now I will understand you and Marcus through your lively political jargon...
Christina, thank you kindly for your (far too) nice words. To be honest, I thought any article on pollsters would be too boring to read, especially mine, so it's a relief that someone (other than those like Marcus with strange political fetishes) would get some utility out of it.
Martin, rather surprisingly Gallup has not done any state polls so far this year. In 2004 they did do some state polls in conjunction with CNN and USAT, but had the dubious honour of being the only pollster who managed to call Ohio, Florida and Pennsylvania the wrong way. Perhaps not coincidentally, CNN are using ORC this year. However, the Gallup national tracker was much more reasonable, predicting a 2-point win for Bush. (The result was a 3-point win.)