[Home]  [Headlines]  [Latest Articles]  [Latest Comments]  [Post]  [Mail]  [Sign-in]  [Setup]  [Help]  [Register] 


The Water Cooler

Title: Nate Silver's Best & Worst Polls of 2012: Gallup Did TERRIBLE
Source: 538
URL Source: http://fivethirtyeight.blogs.nytime ... he-2012-presidential-race/#h[]
Published: Nov 10, 2012
Author: 538
Post Date: 2012-11-10 21:51:17 by Brian S
Keywords: None
Views: 535

As Americans’ modes of communication change, the techniques that produce the most accurate polls seem to be changing as well. In last Tuesday’s presidential election, a number of polling firms that conduct their surveys online had strong results. Some telephone polls also performed well. But others, especially those that called landlines only or took other methodological shortcuts, performed poorly and showed a more Republican-leaning electorate than the one that actually turned out.

Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in “herding” toward the end of the campaign, changing their methods and assumptions such that their results are more in line with those of other polling firms.

There were roughly two dozen polling firms that issued at least five surveys in the final three weeks of the campaign, counting both state and national polls. (Multiple instances of a tracking poll are counted as separate surveys in my analysis, and only likely voter polls are used.)

For each of these polling firms, I have calculated the average error and the average statistical bias in the margin it reported between President Obama and Mitt Romney, as compared against the actual results nationally or in the state in question.

For instance, a polling firm that had Mr. Obama ahead by two points in Colorado — a state that Mr. Obama actually won by about five points — would have had a three-point error for that state. It also would have had a three-point statistical bias toward Republicans there.

The bias calculation measures in which direction, Republican or Democratic, a firm’s polls tended to miss. If a firm’s polls overestimated Mr. Obama’s performance in some states, and Mr. Romney’s in others, it could have little overall statistical bias, since the misses came in different directions. In contrast, the estimate of the average error in the firm’s polls measures how far off the firm’s polls were in either direction, on average.
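The distinction between average error and average bias described above can be sketched in a few lines of Python. The polls and numbers here are invented for illustration, not FiveThirtyEight's actual data:

```python
# Sketch of the error-vs-bias distinction, using made-up polls.
# A margin is (Obama % minus Romney %); positive numbers favor Mr. Obama.

def poll_stats(polls):
    """polls: list of (predicted_margin, actual_margin) pairs, in points.

    Returns (average absolute error, average signed bias); a negative
    bias means the polls understated Mr. Obama's margin (Republican lean).
    """
    misses = [pred - actual for pred, actual in polls]
    avg_error = sum(abs(m) for m in misses) / len(misses)
    avg_bias = sum(misses) / len(misses)
    return avg_error, avg_bias

# A firm that missed by 3 points toward each party in two states has a
# 3-point average error but zero overall bias, because the misses cancel:
error, bias = poll_stats([(2, 5), (5, 2)])  # misses of -3 and +3
print(error, bias)  # 3.0 average error, 0.0 overall bias
```

This is why the two measures answer different questions: error captures how far off a firm was, bias captures whether its misses leaned consistently toward one party.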

Among the more prolific polling firms, the most accurate by this measure was TIPP, which conducted a national tracking poll for Investor’s Business Daily. Relative to other national polls, its results seemed to be Democratic-leaning at the time they were published. However, it turned out that most polling firms underestimated Mr. Obama’s performance, so those that had what had seemed to be Democratic-leaning results were often closest to the final outcome.

Conversely, polls that were Republican-leaning relative to the consensus did especially poorly.

Among telephone-based polling firms that conducted a significant number of state-by-state surveys, the best results came from CNN, Mellman and Grove Insight. The latter two conducted most of their polls on behalf of liberal-leaning organizations. However, as I mentioned, since the polling consensus underestimated Mr. Obama’s performance somewhat, the polls that seemed to be Democratic-leaning often came closest to the mark.

Several polling firms got notably poor results, on the other hand. For the second consecutive election — the same was true in 2010 — Rasmussen Reports polls had a statistical bias toward Republicans, overestimating Mr. Romney’s performance by about four percentage points, on average. Polls by American Research Group and Mason-Dixon also largely missed the mark. Mason-Dixon might be given a pass since it has a decent track record over the longer term, while American Research Group has long been unreliable.

FiveThirtyEight did not use polls by the firm Pharos Research Group in its analysis, since the details of the polling firm are sketchy and since the principal of the firm, Steven Leuchtman, was unable to answer due-diligence questions when contacted by FiveThirtyEight, such as which call centers he was using to conduct the polls. The firm’s polls turned out to be inaccurate, and to have a Democratic bias.

It was one of the best-known polling firms, however, that had among the worst results. In late October, Gallup consistently showed Mr. Romney ahead by about six percentage points among likely voters, far different from the average of other surveys. Gallup’s final poll of the election, which had Mr. Romney up by one point, was slightly better, but still identified the wrong winner in the election. Gallup has now had three poor elections in a row. In 2008, their polls overestimated Mr. Obama’s performance, while in 2010, they overestimated how well Republicans would do in the race for the United States House.

Instead, some of the most accurate firms were those that conducted their polls online.

The final poll conducted by Google Consumer Surveys had Mr. Obama ahead in the national popular vote by 2.3 percentage points – very close to his actual margin, which was 2.6 percentage points based on ballots counted through Saturday morning.

Ipsos, which conducted online polls for Reuters, came close to the actual results in most places that it surveyed, as did the Canadian online polling firm Angus Reid. Another online polling firm, YouGov, got reasonably good results.

The online polls conducted by JZ Analytics, run by the pollster John Zogby, were not used in the FiveThirtyEight forecast because we do not consider their method to be scientific, since it encourages voters to volunteer to participate in their surveys rather than sampling them at random. Their results were less accurate than most of the online polling firms, although about average as compared with the broader group of surveys.

We can also extend the analysis to consider the 90 polling firms that conducted at least one likely voter poll in the final three weeks of the campaign. One should probably not read too much into the results for the individual firms that issued just one or two polls, which is not a sufficient sample size to measure reliability. However, a look at this broader collective group of pollsters, and the techniques they use, may tell us something about which methods are most effective.

Among the nine polling firms that conducted their polls wholly or partially online, the average error in calling the election result was 2.1 percentage points. That compares with a 3.5-point error for polling firms that used live telephone interviewers, and 5.0 points for “robopolls” that conducted their surveys by automated script. The traditional telephone polls had a slight Republican bias on the whole, while the robopolls often had a significant Republican bias. (Even the automated polling firm Public Policy Polling, which often polls for liberal and Democratic clients, projected results that were slightly more favorable for Mr. Romney than what he actually achieved.) The online polls had little overall bias, however.
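The mode-by-mode comparison above amounts to grouping each poll's miss by methodology and averaging within each group. A minimal sketch, with invented polls rather than the article's data:

```python
from collections import defaultdict

# Invented example polls: (mode, predicted_margin, actual_margin) in points,
# positive = Obama ahead. These are NOT the actual 2012 figures.
polls = [
    ("online", 2.3, 2.6), ("online", 3.0, 2.6),
    ("live phone", 0.0, 2.6),
    ("robopoll", -2.0, 2.6),
]

by_mode = defaultdict(list)
for mode, pred, actual in polls:
    by_mode[mode].append(pred - actual)   # signed miss for this poll

for mode, misses in by_mode.items():
    avg_error = sum(abs(m) for m in misses) / len(misses)
    avg_bias = sum(misses) / len(misses)  # negative = Republican lean
    print(f"{mode}: error={avg_error:.1f}, bias={avg_bias:+.1f}")
```

With real data, each tuple would be one likely-voter poll from the final three weeks, and the averages would reproduce the 2.1 / 3.5 / 5.0-point comparison the article reports.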

The difference between the performance of live telephone polls and the automated polls may partly reflect the fact that many of the live telephone polls call cellphones along with landlines, while few of the automated surveys do. (Legal restrictions prohibit automated calls to cellphones under many circumstances.)

Research by polling firms and academic groups suggests that polls that fail to call cellphones may underestimate the performance of Democratic candidates.

The roughly one-third of Americans who rely exclusively on cellphones tend to be younger, more urban, worse off financially and more likely to be black or Hispanic than the broader group of voters, all characteristics that correlate with Democratic voting. Weighting polling results by demographic characteristics may make the sample more representative, but there is increasing evidence that these weighting techniques will not remove all the bias that is introduced by missing so many voters.
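The demographic weighting described here can be sketched as simple post-stratification: scale each respondent group so its share of the sample matches its assumed share of the electorate. The groups and percentages below are illustrative only:

```python
# Illustrative post-stratification: reweight an age-skewed sample so each
# group counts in proportion to an assumed electorate. Numbers are invented.

sample = {          # group -> (respondents, Obama share within group)
    "under_30": (100, 0.60),
    "30_plus":  (900, 0.48),
}
electorate_share = {"under_30": 0.19, "30_plus": 0.81}  # assumed targets

total = sum(n for n, _ in sample.values())
raw = sum(n * share for n, share in sample.values()) / total
weighted = sum(
    electorate_share[g] * share           # weight each group by its target
    for g, (n, share) in sample.items()
)
print(f"raw Obama share {raw:.3f} -> weighted {weighted:.3f}")
```

In this toy case, upweighting the underrepresented younger group nudges the estimate toward Mr. Obama, which is the direction of the correction the text describes. The limitation noted above is that weighting can only adjust the respondents a poll actually reached; if cell-only voters who answer differ from those who were never called at all, some bias remains.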

Some of the overall Republican bias in the polls this year may reflect the fact that Mr. Obama made gains in the closing days of the campaign, for reasons such as Hurricane Sandy, and that this occurred too late to be captured by some polls. In the FiveThirtyEight “now-cast,” Mr. Obama went from being 1.5 percentage points ahead in the popular vote on Oct. 25 to 2.5 percentage points ahead by Election Day itself, close to his actual figure.

Nonetheless, polls conducted over the final three weeks of the campaign had a two-point Republican bias overall, probably more than can be explained by the late shift alone. In addition, likely voter polls were slightly more Republican-leaning than the actual results in many races in 2010.

In my view, there will always be an important place for high-quality telephone polls, such as those conducted by The New York Times and other major news organizations, which make an effort to reach as representative a sample of voters as possible and which place calls to cellphones. And there may be an increasing role for online polls, which can have an easier time reaching some of the voters, especially younger Americans, that telephone polls are prone to miss. I’m not as certain about the future for automated telephone polls. Some automated polls that used innovative strategies got reasonably good results this year. SurveyUSA, for instance, supplements its automated calls to landlines with live calls to cellphone voters in many states. Public Policy Polling uses lists of registered voters to weight its samples, which may help to correct for the failure to reach certain kinds of voters.

Rasmussen Reports uses an online panel along with the automated calls that it places. The firm’s poor results this year suggest that the technique will need to be refined, but at least it has a game plan to deal with the new realities of polling. In contrast, polls that place random calls to landlines only, or that rely upon likely voter models that were developed decades ago, may be behind the times.

Perhaps it won’t be long before Google, not Gallup, is the most trusted name in polling.



