US: Explainer – our US political polling methodology
Who do we talk to?
All of Opinium’s election polling is conducted online via survey panels, whereby people opt in to take part in surveys on all sorts of subjects in exchange for vouchers once they complete a certain number of surveys. While politics is naturally going to come up more often in an election year, participants aren’t told what a survey is about before starting. The link they click could lead to questions about politics, but could just as easily be about what they plan to buy on Black Friday or who they think should get kicked off The Bachelor that week. This means we run less risk of only speaking to political die-hards who don’t represent the whole population.
What do we ask?
After we’ve established that we’re talking to people who live in the United States and are aged 18 or over, we ask if they are registered to vote. If ‘yes’, they see the block of questions that we use for voting intention:
- On a scale of 1-10, how likely are you to vote in the election?
- How do you plan to vote? (e.g. in person, by mail, or early)
- Who do you plan to vote for? (or “who have you voted for” if you have already voted)
- Which House of Representatives candidate would you vote for? (generic ballot)
The final “share of the vote” numbers are the results of that third question among those who say they are 10 out of 10 certain to vote.
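As a rough illustration of that filter (the data and column names below are hypothetical, not our actual dataset or code), the headline figure is simply the weighted distribution of vote choice among the 10-out-of-10 respondents:

```python
import pandas as pd

# Hypothetical respondent-level data; column names and values are illustrative.
respondents = pd.DataFrame({
    "likelihood_to_vote": [10, 10, 7, 10, 4, 10],  # self-rated 1-10 certainty
    "vote_choice": ["Biden", "Trump", "Biden", "Trump", "Biden", "Biden"],
    "weight": [0.9, 1.1, 1.0, 1.2, 0.8, 1.0],      # post-weighting survey weight
})

# Keep only respondents who are 10/10 certain to vote, then take each
# candidate's weighted share among them.
certain = respondents[respondents["likelihood_to_vote"] == 10]
shares = certain.groupby("vote_choice")["weight"].sum()
print((100 * shares / shares.sum()).round(1))
```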
How do we make the sample representative?
The list of possible quotas and weights could be extensive, but at a certain point practicality has to intervene, so we focus on those we have found are statistically most likely to make a difference.
The factors we include are:
- Gender and age group
- Region of the US (e.g. New England)
- Race (using the US census two-question formulation)
- Number of cars in your household
- Education level (high school, some college, etc.)
- Registered to vote / not registered
- Past vote in 2016 and 2018 (among those who say they voted)
We ask registered voters about previous elections, filtered to those where they would have been old enough to vote at the time.
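To show the mechanics (a simplified sketch, not our actual weighting code), the standard technique for matching a sample to several targets at once is raking, or rim weighting: cycling through the factors and rescaling each respondent’s weight until the weighted sample matches the population on all of them. The categories and targets below are made up for illustration; a real run would use all of the factors listed above:

```python
import itertools
import pandas as pd

# Made-up sample covering every gender/age x region combination.
gender_age = ["M 18-34", "F 18-34", "M 35-54", "F 35-54", "M 55+", "F 55+"]
regions = ["Northeast", "Midwest", "South", "West"]
sample = pd.DataFrame(list(itertools.product(gender_age, regions)),
                      columns=["gender_age", "region"])
sample["weight"] = 1.0

# Hypothetical population targets for each factor (shares sum to 1).
targets = {
    "gender_age": {"M 18-34": 0.15, "F 18-34": 0.15, "M 35-54": 0.17,
                   "F 35-54": 0.18, "M 55+": 0.16, "F 55+": 0.19},
    "region": {"Northeast": 0.17, "Midwest": 0.21, "South": 0.38, "West": 0.24},
}

# Raking: repeatedly rescale weights so that each factor's weighted
# distribution matches its target; the loop settles after a few passes.
for _ in range(25):
    for var, target in targets.items():
        observed = sample.groupby(var)["weight"].sum() / sample["weight"].sum()
        sample["weight"] *= sample[var].map(lambda cat: target[cat] / observed[cat])

# Check: the weighted region distribution now matches the target shares.
print(sample.groupby("region")["weight"].sum() / sample["weight"].sum())
```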
Social desirability bias can sometimes make people who didn’t vote, or aren’t registered, tick that they did, so we frame our questions to remove any stigma. For instance, our “are you registered to vote” question mentions that over 80 million Americans aren’t registered.
Our questions about past elections first ask a yes/no question about whether the respondent voted in each of those elections, rather than simply showing a list of candidates and presenting having voted as the “normal” option. Only those who tick “yes” for, say, the 2016 election are then asked who they voted for in 2016.
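As a sketch of that routing (the wording and function below are illustrative, not our actual survey script), combining the age-eligibility filter mentioned above with the yes/no gate:

```python
# Elections we weight by, mapped to the year they were held.
ELECTIONS = {"2016 presidential": 2016, "2018 mid-terms": 2018}
VOTING_AGE = 18

def past_vote_questions(respondent_age: int, survey_year: int = 2020) -> list[str]:
    """Build the past-vote questions a given respondent should see."""
    questions = []
    for name, year in ELECTIONS.items():
        # Only ask about elections the respondent was old enough to vote in.
        if respondent_age - (survey_year - year) >= VOTING_AGE:
            # The gate question comes first, so voting isn't the default.
            questions.append(f"Did you vote in the {name} election? (yes/no)")
            # The candidate question is only shown after a 'yes'.
            questions.append(f"[if yes] Who did you vote for in the {name} election?")
    return questions

# A 20-year-old in 2020 was too young for 2016 but old enough for 2018.
print(past_vote_questions(respondent_age=20))
```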
The answers here are used to make our sample politically as well as demographically representative. Ordinarily, mid-terms have turnout too low to be useful for this exercise, but 2018 was an exceptionally high-turnout year, so we feel comfortable that including it makes our sample more representative rather than less.
Finally, something we do not strictly quota on, but do pay attention to, is partisan registration and partisan lean. We ask two questions: whether you are registered as a Republican, a Democrat, an Independent, or with no party, and whether you identify with either party or neither. The decision to weight by past vote rather than these measures came down largely to the availability of better, more reliable data for past vote (namely, election results), and we have found that doing so means the resulting distribution of party identification tends to be reasonably consistent.
What do we do about shy Trump voters / social desirability bias?
This is a hypothesis that has been put forward throughout the election, but one which we don’t believe affects our results.
The theory is that in polls involving a live human interviewer (or IVR polls which could sound like a live human interviewer), some Trump-leaning voters may consciously or unconsciously believe that supporting the president is likely to lead to them being judged or looked down upon by the interviewer.
Social desirability bias and interviewer effects are well-known phenomena in survey research more generally, and they are the reason why sensitive subjects have to be framed with this in mind and, sometimes, approached through multiple questions to get to the truth rather than asked about directly.
However, when people take part in Opinium surveys, they see question text on a screen and either click the answer they agree with or type a response into a text box. The interview is impersonal, with the “interviewer” having no face or name, and we reassure participants at various points that their answers will be anonymous and only looked at in the aggregate.
If this “shy Trump voters” effect were widespread, one would expect more anonymous methods like ours and other online polls to show better results for the president than live-caller polls. Yet not only does such a persistent gap not exist, but our final poll has some of the worst numbers for the president of any public polling. Therefore, while there are many things that could affect the accuracy of our results, social desirability bias stopping people from telling us who they currently support is unlikely to be one of them.
An area where social desirability may play a part, although this is impossible to verify, is in past vote. We weight our political samples by how people say they voted in the last two national elections, the 2018 mid-terms and the 2016 presidential election. Something we encountered in the UK was that past vote can become a less reliable measure if people grow less comfortable with how they voted. In that instance, some people who voted for the Labour party in 2017 had, by 2019, become sufficiently disillusioned with the party that they were no longer telling us that they had voted for it. This meant that in getting the right number of 2017 Labour voters into our sample, we were actually overrepresenting those who had not become disillusioned and, therefore, overstating Labour’s performance in voting intention in 2019.
Opinium UK dealt with that by using past vote data collected at the time as much as possible but, for the moment, Opinium US doesn’t have that option. Therefore, it is sensible to think about whether Trump 2016 or Clinton 2016 voters might have reason to downplay how they voted in that election and how, if at all, that may affect our final numbers.