Skip to 0 minutes and 9 seconds All right, so we’ve got various challenges to election prediction. So maybe we can try to consolidate a little bit here and list the challenges in rough order of importance and also think a little bit about whether there are some challenges we haven’t yet mentioned. So what would be your top four list or top three list? OK, well I can give you a list that works backwards, and then I might talk about importance. So if we work backwards, the thing most people want to know about is the actual result of the election in seats or who’s the winner. That’s going to depend on how the votes are distributed throughout the country.
Skip to 1 minute and 0 seconds So we get into the details of the electoral college or UK constituencies or these other things. So that’s like the last stage in the process, doing your modelling of where you think the votes are going to land. So that might be one way of going wrong. If you get your modelling wrong, you could mess that stage up. So that’s the last stage, moving from votes to seats or outcomes.
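The votes-to-seats stage described above is often modelled with something like uniform national swing: apply the same national shift in vote share to every constituency and re-count the winners. The transcript doesn't name a specific model, so this is just a minimal sketch of one common approach; the party labels and vote shares below are invented.

```python
# Hypothetical illustration: uniform national swing (UNS), one simple way
# to map a national change in vote share onto constituency-level results.
# All party names and baseline shares are made-up examples.

def apply_uniform_swing(constituencies, swing):
    """Add the same national swing to every constituency's vote shares
    and return the number of seats each party wins."""
    seats = {}
    for shares in constituencies:
        adjusted = {party: share + swing.get(party, 0.0)
                    for party, share in shares.items()}
        winner = max(adjusted, key=adjusted.get)  # first-past-the-post
        seats[winner] = seats.get(winner, 0) + 1
    return seats

# Three toy constituencies with invented baseline vote shares (%).
constituencies = [
    {"Con": 45.0, "Lab": 33.0, "LD": 22.0},
    {"Con": 38.0, "Lab": 42.0, "LD": 20.0},
    {"Con": 30.0, "Lab": 28.0, "LD": 42.0},
]

# Suppose the polls imply Labour is up 5 points and the Conservatives down 5.
print(apply_uniform_swing(constituencies, {"Con": -5.0, "Lab": +5.0}))
```

Even with perfectly accurate national vote shares, an unrealistic swing model at this stage can still produce the wrong seat count, which is the point made above.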
Skip to 1 minute and 24 seconds Moving back a stage, we’ve got this problem of people saying they’re going to turn out to vote and not actually doing it. So people might truthfully report that they would vote for the Labour or Conservative parties if they did go out to vote. And they might say that they’re going to vote. But in the end, come the day of the election, they don’t do it. So that turnout stage– that might be another way of getting it wrong. Now, that turnout stage is easier where you’ve got really high turnout. If you do it like they do in Australia, if you make voting compulsory, suddenly things get a lot easier. That stage becomes a lot less problematic.
Skip to 2 minutes and 6 seconds But in the UK, where turnout has ranged between 60% and 70%, getting that turnout stage wrong could still throw off your forecast of what’s going to happen. So that’s all, in a sense, once you’ve got your data. If we’re working our way backwards, then the next stage is when we’re actually asking people, who are you going to vote for? We might be calling them up. We might be doing it online. But there’s always going to be a presumption that these people are responding truthfully or responding with their truth at the time of asking. But if people are downright malevolent, and they just think, well, this polling company is calling me up– I’m going to mess with them.
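One common way pollsters handle the turnout problem described above is to weight each respondent's stated vote intention by their self-reported likelihood of voting (several UK pollsters use a 0–10 scale). The transcript doesn't prescribe a method, so this is only a sketch of that idea, with invented figures.

```python
# Hypothetical sketch of a turnout adjustment: weight each respondent's
# vote intention by their self-reported likelihood of voting on a 0-10
# scale, treated here as a probability. All figures are invented.

respondents = [
    {"party": "Lab", "likelihood": 10},
    {"party": "Lab", "likelihood": 4},   # says Lab, but unlikely to turn out
    {"party": "Con", "likelihood": 9},
    {"party": "Con", "likelihood": 8},
]

totals = {}
for r in respondents:
    weight = r["likelihood"] / 10  # 10/10 counts fully, 4/10 counts at 0.4
    totals[r["party"]] = totals.get(r["party"], 0.0) + weight

total_weight = sum(totals.values())
shares = {party: round(v / total_weight, 3) for party, v in totals.items()}
print(shares)
```

Note how the raw sample splits 2–2, but the turnout-weighted estimate favours the party whose supporters say they are more likely to vote.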
Skip to 2 minutes and 58 seconds I’m a conservative voter, but I’m going to tell them that I’m going to vote labour. That would be a problem. I don’t think there are many people who are like that. But there might be more people who just want to conceal a little bit their vote. So sometimes you get this with extremist parties. So if you take a party like the British National Party, it’s an overtly racist party. It’s quite rare that you’d call someone up on the phone, and if they are minded to vote for that party that they would be open about that. They might just say, oh, I don’t know. I don’t know.
Skip to 3 minutes and 40 seconds And that saves them that social pressure of having to say to someone, yeah, I’m voting for this party, and they’re quite extreme. Now, in the UK in the 1992 election there were some people that said that’s what happened when the polling companies underestimated the Conservative share of the vote. People thought there might be a lot of shy Tories, people who thought, well, the Conservative Party’s got a reputation of being a nasty party. I’m not a nasty person. I wouldn’t vote for a nasty party. So when someone calls me up, I’m not going to say that I’m going to vote for the Conservative Party. I’m going to say that I’m undecided. So that’s a risk.
Skip to 4 minutes and 23 seconds When you ask people something, you have to rely on them being somewhat truthful. There are ways of guarding against that risk. So some of the mechanisms that I just talked about in terms of social pressure, people not wanting to confess to unpopular or minority opinions– they might be less common where it’s just you in front of a computer. If you have to tell someone something over the phone, you might conceal things that you’d be happy admitting online. Because on the internet, who knows who you are? So that honest reporting– that’s way back in the chain.
Skip to 5 minutes and 3 seconds And then at the beginning of the chain, you’ve got getting your sample right and making sure that you’re getting in touch with the right kind of people. Because when opinion polling started, this idea that you could get a scrupulously random sample from the population– that was still viable. And once telephones reached mass penetration, you could do things like random digit dialling. You could get a random sample of telephone users. They would be a good guide to the population. And when you called people up, they would respond. They wouldn’t just slam the phone down. That’s not really possible now, certainly for telephone surveying. Response rates will be in the single digits.
Skip to 5 minutes and 55 seconds Meaning less than one in ten of the people that you’re calling are actually agreeing to answer the survey. So it’s very hard to get people to answer. And because it’s hard to get people to answer, those people who do answer, they’re weird in some way.
Skip to 6 minutes and 16 seconds And that’s not problematic if they’re weird in ways that we understand. So let’s say I’m calling people up. I’m doing some telephone surveys, and I’m calling during the day. And the people I get during the day, they might be lots of pensioners who don’t have a job to go to and maybe lots of students. Although students may not have a landline phone. So I might get this really weird age distribution. But if I know about that, I can counter that. I can down-weight the responses from elderly and younger groups and up-weight those middle-aged voters that I’m missing out on.
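The down-weighting and up-weighting described above is essentially demographic weighting: compare the sample's age profile to the population's and re-weight each group accordingly. Here is a minimal sketch of that idea; all the shares and support figures are invented for illustration.

```python
# Hypothetical illustration of demographic weighting. A daytime phone
# sample over-represents older respondents, so each age group is
# re-weighted to match the population's profile. All numbers are invented.

sample_share     = {"18-29": 0.10, "30-59": 0.30, "60+": 0.60}  # who answered
population_share = {"18-29": 0.20, "30-59": 0.50, "60+": 0.30}  # who exists

# Weight for each group = population share / sample share.
# 60+ respondents get down-weighted (0.5); 30-59 get up-weighted (~1.67).
weights = {g: population_share[g] / sample_share[g] for g in sample_share}

# Raw support for some party within each age group (invented).
support = {"18-29": 0.45, "30-59": 0.40, "60+": 0.55}

# Unweighted estimate just reflects the skewed sample; the weighted
# estimate reflects the population's age mix instead.
unweighted = sum(sample_share[g] * support[g] for g in support)
weighted   = sum(population_share[g] * support[g] for g in support)
print(round(unweighted, 3), round(weighted, 3))
```

The two estimates differ because the over-sampled pensioners happen to support the party more; the weighting corrects for an imbalance we know about and can measure.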
Skip to 6 minutes and 58 seconds The problem comes if people are– these telephone responders– if they’re weird in some other way that’s hard for us to account for or adjust for. So interest in politics is going to be the obvious one. I teach undergraduate students of politics. And one of the things I repeatedly say to them is you are weird. Most people are not interested in politics. Most people have very, very low levels of knowledge. You guys are going to be disproportionately likely to be interested in things. You might be the kind of people who pick up the phone, who are happy talking for 15 or 20 minutes to someone over the phone about your political opinions.
Skip to 7 minutes and 42 seconds But that means that if the people who are responding to this are more interested in politics than the general population, then the results of that sample are going to be skewed in some way.
Potential Pitfalls for Election Prediction
This interview nicely complements my video from step 2.14, but with an exclusive focus on election prediction. Here’s my summary of the key points, which mix old and new ideas.
- Election polls often measure the wrong thing since they are often national even when elections are decided constituency by constituency.
- Predicting who will vote is crucial for election prediction but hard to do.
- Respondents might lie, or just be unintentionally inaccurate, about various things such as their likelihood of voting or which candidates they support. It is often claimed, for example, that there is a “shy Tory effect” in UK politics whereby some Conservatives, especially young ones, hide their leanings.
- For various reasons you may not be able to complete interviews with many people who have been randomly selected into your sample. For example, some people may not pick up the phone or may hang up when they learn that a pollster is on the line.
Is problem 4 really such a big issue?
If, for example, a pollster wants a sample of 2,000 people and is able to complete 1 interview out of every 10 attempts, then can’t she just attempt 20,000 interviews with the goal of taking 2,000 of them to completion?
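One way to see why simply attempting more interviews may not be enough: if the people who answer differ systematically from those who don't, the completed sample stays biased no matter how many attempts are made. The following toy simulation (all numbers invented) assumes 50% of the population supports party A but that A's supporters are twice as likely to answer the phone.

```python
# Toy simulation of non-response bias. True support for party A is 50%,
# but A-supporters answer the phone at 14% vs 7% for everyone else.
# Completing 2,000 interviews still yields a badly skewed estimate.
import random

random.seed(0)

completed, supporters, attempts = 0, 0, 0
while completed < 2000:
    attempts += 1
    supports_a = random.random() < 0.5           # true support: 50%
    answer_prob = 0.14 if supports_a else 0.07   # differential non-response
    if random.random() < answer_prob:
        completed += 1
        supporters += supports_a

# The estimate converges near 0.14 / (0.14 + 0.07) = 2/3, not 0.5,
# and taking even more interviews would not change that.
print(attempts, round(supporters / completed, 3))
```

So the answer to the question above is: only if the non-responders are like the responders. When they aren't, scaling up the attempts scales up the same bias.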
© Royal Holloway, University of London