Last week I was struck by a comment made by FiveThirtyEight.com’s (538) Editor-in-Chief Nate Silver about Alaska polling. He said, “I’m not sure why Alaska is close, and the polls there are kind of crap.”
My first instinct, as it usually is, was to stick up for my fine Alaska brethren and say, “Hey, Alaska polling is just as good as your East Coast elites’ polling.” After all, any political professional who has worked anywhere has heard the work of every pollster in the universe disparaged when the results aren’t to the viewer’s liking.
Unfortunately, Silver’s opinion carries some weight here. First of all, he likely has no particular interest in the outcomes of races in Alaska, so he would have little to no bias beyond wanting to get accurate data for his national models. Second, 538’s sole reason for existing is to evaluate polling and election data. If there is anyone in America who qualifies as an independent expert on which pollsters are good and which are bad, it is Silver.
All of us who have done political work in the Last Frontier have heard all of the cautions about the quirkiness of Alaska polling. Alaska’s small population (particularly the small size of our state house districts), the transience of our people and the difficulty in accurately sampling rural populations are just a few of the many problems I’ve heard as reasons Alaska polling is questionable.
How do the pollsters who publish most of the local data we see stack up? Which ones should we put the most stock in?
Luckily for us, 538 doesn’t just talk trash; they actually dig in and evaluate each one. We put together this sampling of their evaluations for pollsters we have seen operating in Alaska recently:
Based on the overall grades given by 538, Lake Research Partners, Dittman Research, Strategies 360, and Moore Information (not Ivan Moore) are the best of the bunch in Alaska. They all scored a B or B- grade. Alaska Survey Research (rebranded from Ivan Moore Research) comes in with the lowest grade, a C.
Just so we are all on the same page, here are the criteria for a poll to be evaluated by 538:
“Polls a firm conducted in the final three weeks of U.S. House, U.S. Senate, gubernatorial and presidential general election campaigns since 1998, and the last three weeks before presidential primaries and caucuses from 2000 through June 7, 2016.”
538 also lists several factors in how they evaluate pollsters. The easiest for most of us to understand is the percentage of races in which a pollster has accurately called the winner. By that metric, if our statistical skills are what we think they are, Dittman Research has a truly impressive 100% track record. If they say you are going to win, you probably are.
Also of interest to those of us who are polling data geeks, and something that separates 538 from a less sophisticated poll aggregation site like Real Clear Politics, is their “mean-reverted bias” rating. It tells you which way a pollster’s numbers typically skew, to the Rs or Ds. 538 defines it as:
“A pollster’s historical average statistical bias toward Democratic or Republican candidates, reverted to a mean of zero based on the number of polls in the database.”
This is handy in telling us whether we are looking at a Dem-friendly poll or one with a GOP bent to it.
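For the curious, the idea behind mean reversion is just statistical shrinkage: a pollster with only a handful of polls in the database gets its observed bias pulled most of the way back toward zero, while a prolific pollster keeps most of its raw number. Here is a minimal sketch of that idea in Python; note that the `prior_strength` constant and the exact weighting are my own illustrative assumptions, not 538’s actual formula, which they do not publish in this definition.

```python
def mean_reverted_bias(raw_bias: float, n_polls: int, prior_strength: float = 30.0) -> float:
    """Shrink a pollster's observed average bias toward a mean of zero.

    raw_bias: the pollster's historical average bias in points
              (sign convention here: positive = Democratic-leaning,
              an assumption for illustration).
    n_polls: how many of the firm's polls are in the database.
    prior_strength: an invented constant controlling how hard small
                    samples are pulled toward zero; 538's real
                    weighting is not specified.
    """
    return raw_bias * n_polls / (n_polls + prior_strength)


# A firm with a raw +3.0 point bias over only 10 polls gets heavily shrunk:
print(mean_reverted_bias(3.0, 10))    # 0.75
# The same raw bias over 300 polls mostly survives:
print(round(mean_reverted_bias(3.0, 300), 2))    # 2.73
```

The practical upshot: a scary-looking bias number from a firm with a thin track record means much less than the same number from a firm with hundreds of rated polls.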
Interestingly, it would appear pollsters Moore Information and Strategies 360, who do a lot of work in Alaska for Democrats or left-leaning interest groups, actually have a slight Republican-friendly bias. Everyone else, including those with more GOP clients like Dittman Research and Hellenthal and Associates, has some level of Democrat-leaning bias in their numbers.
So there we are, now we know whose numbers we can trust and by how much. Or maybe we don’t know anything at all.