Liars, damned liars, and people who respond to opinion polls

Stats Talk
May 29, 2015

“If this exit poll is right, I will publicly eat my hat on your programme,” said Paddy Ashdown on the BBC’s election night coverage. People on all sides struggled to believe the exit polls and came up with all sorts of explanations, until it turned out it was the opinion polls that got it wrong. There are several possibilities: people lied in all the polls before the exit poll or loads of people changed their minds once they had the pencils in their hands or people voted tactically. Maybe exit polls are just better, but they’re definitely different.

What are exit polls?

Exit polls (http://www2.warwick.ac.uk/fac/sci/statistics/staff/academic-research/firth/exit-poll-explainer) are not just a proxy count of votes: they are modelled along with the results of previous elections, previous local exit polls, and swings in party support. Results of previous elections matter once voter intransigence is assumed, that is, if people are likely to vote for the same party as last time. In one sense, an exit poll is a panel survey, and inter-election exit poll comparisons are essentially trackers. Like trackers, they depend on careful sampling and on retaining the same participants; in this case the ‘participant’ is the polling station rather than the individual voter.

The sample for an exit poll is between 100 and 200 voters at each of about 100 polling stations, a total of around 20,000. Most polling locations are retained from election to election, but there can be changes, for example to reflect changes in the electorate. As with all opinion polls, exit polls assume some error and build it into the regression models. The models then produce probabilities for each candidate winning each seat, with confidence intervals around predictions of both final vote percentages and seat numbers.
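The modelling step can be sketched in miniature. The toy below applies a uniform national swing, estimated from sampled polling stations, to a seat’s previous vote shares; all party names and numbers here are hypothetical, and the real models are far richer, combining local swings, previous exit polls, and regression with built-in error:

```python
# Illustrative sketch only: a toy "uniform swing" seat projection of the
# kind exit-poll models build on. Hypothetical parties and numbers.

def project_seat(previous_share, national_swing):
    """Apply a uniform swing (in percentage points) to a seat's
    previous vote shares and return the projected winner."""
    projected = {party: share + national_swing.get(party, 0.0)
                 for party, share in previous_share.items()}
    return max(projected, key=projected.get), projected

# Hypothetical seat: vote shares (%) at the previous election.
previous = {"Party A": 38.0, "Party B": 35.0, "Party C": 27.0}
# Hypothetical swing estimated from sampled polling stations.
swing = {"Party A": 1.5, "Party B": -3.0, "Party C": 1.5}

winner, projected = project_seat(previous, swing)
print(winner, projected)  # Party A holds the seat under this swing
```

Repeating this across every seat, with uncertainty attached to each swing estimate, is what turns a sample of polling stations into seat-level win probabilities.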

The important difference between exit polls and most opinion polls lies in the questions posed. Exit polls can ask about actual votes for actual candidate names, whereas national-level opinion polls tend to ask about parties. Party-prompt polls seem better suited to systems using electoral lists, where votes are cast for parties whose members are ranked, so the higher a candidate sits on the list, the higher their probability of being elected. Some countries’ rules allow the popular vote to influence the order in which a party’s candidates are listed, so candidate-specific voting is possible, but it is less powerful than under First-Past-the-Post (FPP) or proportional representation.

Curiously, one outlier poll actually got very close to the actual results, but it was so different from the preceding polls that Survation (http://survation.com/snatching-defeat-from-the-jaws-of-victory/?utm_source=dlvr.it&utm_medium=twitter) “chickened out” of publishing it. There was nothing magic about what Survation did: it was a 1,000-respondent, nationally representative telephone poll taken the day before the election, which listed the candidates in each respondent’s constituency and asked how they planned to vote. At that juncture there was less scope for confounding and, possibly, more social-desirability pressure to keep one’s word. However, FiveThirtyEight (http://fivethirtyeight.com/liveblogs/uk-general-election-2015/?#livepress-update-21944073) modelled both candidate prompts and party prompts and concluded that generic questions about party-level voting intention were more accurate than specific ones.
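For scale, the sampling error of a 1,000-respondent poll can be computed directly. The snippet below uses the textbook simple-random-sample formula, not Survation’s actual weighting scheme, whose effective error would be somewhat larger:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a simple
    random sample of size n (textbook approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a 1,000-respondent poll:
moe = margin_of_error(0.5, 1000)
print(f"±{moe * 100:.1f} percentage points")  # prints ±3.1 percentage points
```

With roughly ±3 points of noise on each party’s share, a single 1,000-respondent poll landing near the final result is consistent with luck as well as with a better question design.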

So, were lots of people just fibbing?

An oft-repeated explanation in the past couple of weeks has been that Conservative and UKIP voters were “shy” in their responses to opinion polls. Eric Kaufmann (http://blogs.lse.ac.uk/politicsandpolicy/the-shy-english-nationalists-who-won-it-for-the-tories-and-flummoxed-the-pollsters) points to other research that indicates reluctance on the part of UKIP supporters to answer questions about class, and that shows lower levels of trust in others. Evidence from polls of Scottish voters can, at least, eliminate the simple shy-Tory explanation: estimates of Conservative votes were consistent with the election result, and it was the SNP that was underestimated.

Tactical voting is another peculiarity of the FPP system that may have had an impact on the gap between the opinion and exit polls. The process might go something like this: I’d like to vote for Party X, so that’s what I’ll tell the pollsters, but I live in a constituency where Party Y holds the seat and where only Party Z can challenge them. I don’t want Party Y to win, so I’ll vote for Z instead. (If it helps, insert UKIP, LAB, and CON for X, Y, and Z.) An attempt at mass vote-switching was orchestrated by a certain right-wing daily newspaper, which ran a headline a couple of weeks before the election instructing citizens in 50 constituencies how to vote tactically to “help keep Labour out of Number 10”. Whether it worked or not is of less concern than whether a campaign of this sort should be considered a confounding variable in analysis of polls.
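The switching logic above can be written down as a toy rule, with hypothetical party names and standings; real tactical voters weigh much more than two candidates’ chances:

```python
# Toy sketch of the tactical switch described above: the stated
# preference (what a pollster hears) differs from the ballot cast.

def tactical_vote(preferred, local_standings, blocked):
    """Return the vote actually cast: if the preferred party is not
    competitive locally, switch to the best-placed challenger to the
    party the voter wants to block."""
    ranked = sorted(local_standings, key=local_standings.get, reverse=True)
    if preferred in ranked[:2]:           # preferred party is competitive
        return preferred
    challengers = [p for p in ranked if p != blocked]
    return challengers[0]                 # strongest remaining alternative

# Constituency where Y holds the seat and only Z can challenge:
standings = {"Y": 40, "Z": 35, "X": 15}
print(tactical_vote("X", standings, blocked="Y"))  # prints Z
```

The gap between the pollster’s record (“X”) and the ballot (“Z”) is exactly the kind of discrepancy an exit poll, which asks about the vote just cast, never sees.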

It’s very confusing, this exit poll business. People might have lied, or been shy, or voted tactically, but the main difference is methodological. Think about the difference between becoming engaged and getting married: each is based on a different question. About 15% of engagements (http://content.time.com/time/magazine/article/0,9171,490683,00.html) are called off each year, and about 15% of those polled got cold feet, or changed their minds, or lied. If an opinion poll is “Will you marry me?”, an exit poll is “Do you take this man to be your lawful wedded prime minister?”
