Yes, The Election Polls Were Wrong. Here’s How We Fix It.

The cracks in the polling industry have been readily apparent for years, and it was only a matter of time before we had a major polling miss in this country. On Monday, I warned about the accuracy of the polls heading into Election Day; I knew a surprise was coming thanks to my own online surveys throughout the year.

For years, I’ve been advising clients to move their polls from phone to online. Political and issue-oriented clients have been more resistant to the change, while private firms have embraced the lower cost, quick turnarounds, and improved accuracy offered by online surveys.

Much of the resistance from clients in the public affairs space (e.g., campaigns, associations, and public affairs groups) stems from guidance from groups such as FiveThirtyEight, AAPOR, and the AP, which for years maligned many of the publicly available online polls.

Granted, in some cases their skepticism and criticism were warranted. But it was totally unfair and wrong to advise media and organizations to simply avoid “unreliable” online polls. There’s a lot of good online polling being done right now, such as the USC Dornsife/L.A. Times tracking poll, which correctly predicted Trump’s victory.

Another factor is resistance to change. Too many pollsters are wedded to phone polling because of the revenue streams tied to a methodology that has brought them years of professional and financial success.

Those days are now over.

People lack the time or patience to answer a 15-minute phone survey, and the respondents who do stick with it are probably not an accurate reflection of any group other than partisans or people who are natural joiners or volunteers.

Consumers now have caller ID and call-blocking tools to avoid being interrupted by telephone polls, just as they avoid TV commercials with subscription services and time-shifting.

The ability to avoid phone polls, and the lack of desire to participate in them, has led to record-low response and cooperation rates. This has been common knowledge for years, and it biases survey results. Adding cell phone interviews to this stew helps, but it is not a cure, because it is expensive and difficult to reach the 40% to 50% of the population without a landline phone.

Respondents are simply more honest answering an online survey than one administered by a live interviewer over the phone. This is especially true when testing voting intentions involving two candidates with off-the-chart negative ratings competing in a highly charged media environment.

Short online surveys are the way to go.

Let me add one caveat.

Most online surveys use a panel of pre-recruited individuals or households who have agreed to take part in online market research. Typically, there are not enough of these panelists to conduct a statistically reliable poll in a smaller geography such as a congressional or state legislative district. However, online is a viable, and often preferable, option for statewide races and larger congressional districts.

Okay. So, we should go ahead and convert our phone polls to online?

No.

You can’t just convert your phone poll to an online survey. It’s not that simple. Online respondents require different questions and different methods to interpret the results.

This was the mistake newspapers made in the early days of online news. They simply took their print product, which was declining in readership, and replicated it online. Not taking advantage of the new technology was a huge mistake. It’s the same for polling. Simply taking a phone poll and asking the same questions online is not advisable.

What do I suggest? I don’t pretend to have all of the answers, but these are some ideas that have worked for me.

Shift to Online Polling

There are just too many issues with phone polling, ranging from non-response (i.e., failing to get a representative sample of people to talk with you) to coverage (i.e., being unable to reach certain segments of the population, such as prepaid cell phone households).

Purchase quality online sample or grow your own online panels. Online panel quality matters. A lot. For example, a panel built from coupon clippers and entrants to online sweepstakes, contests, and giveaways will skew your results — unless that’s your target audience.

Respondents are more likely to be honest when answering an online poll. I’ve tested it, and so have others. Use this to your advantage. For example, respondents are more likely to state their actual household income when answering an online survey. Ask for household income in the survey and then use it to weight the data. Too many pollsters disregard income questions in phone polls, fearing that respondents are less than honest in their answers. If you believe that, then don’t ask the question, or find a methodology, such as online, where people will answer it accurately. Stop wasting people’s time. It reflects badly on all of us in the industry.

Proper Weighting

We know some population groups are over- or under-represented in a survey sample regardless of methodology. We have to do a better job of weighting (i.e., assigning “corrective” values to each response in the sample), whatever the mode (online, phone, mail, or in-person), to ensure the results reflect the profile of the audience we want to measure.
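To make the idea of “corrective” values concrete, here’s a minimal sketch of simple cell weighting in Python. The income brackets, population benchmarks, and responses are hypothetical; in practice, the targets would come from the Census or another trusted source.

```python
# A minimal sketch of cell (post-stratification) weighting.
# The income brackets and population benchmarks are hypothetical;
# real targets would come from Census data or another trusted source.
import pandas as pd

# Survey responses, each tagged with a household-income bracket.
sample = pd.DataFrame({
    "income_bracket": ["<50k", "<50k", "<50k", "50-100k", "50-100k", "100k+"],
    "supports_candidate_a": [1, 1, 0, 1, 0, 0],
})

# Assumed population distribution of the same brackets (sums to 1).
population_share = {"<50k": 0.40, "50-100k": 0.40, "100k+": 0.20}

# Corrective weight = population share / sample share for each bracket.
sample_share = sample["income_bracket"].value_counts(normalize=True)
sample["weight"] = sample["income_bracket"].map(
    lambda b: population_share[b] / sample_share[b]
)

# Unweighted vs. weighted estimate of candidate support.
unweighted = sample["supports_candidate_a"].mean()
weighted = (sample["supports_candidate_a"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"unweighted: {unweighted:.2f}  weighted: {weighted:.2f}")
```

Even in this toy example, correcting the income mix moves the estimate by several points.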

This election provides a great example. YouGovUS, an online polling firm, polled throughout this election cycle. While they weighted their results across a host of demographic variables, it doesn’t appear they used household income in their weighting scheme. That was a mistake.

Their samples skewed lower income, and Clinton outperformed with lower-income voters while Trump overperformed with upper-income households. Who knows? If they had factored income into their weighting, perhaps they would have had a better read on this election.

Surveys should be weighted on more than the basics. Using simple demographics such as gender, age, and ethnicity to weight a survey is insufficient. You have to include attitudinal and behavioral measures, in addition to demographics such as income, if you want any chance of getting usable results. Harris Interactive did a lot of great work in this area in the early days of online polling.
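When you only have marginal targets for each of several variables, one common way to weight on all of them at once is raking (iterative proportional fitting). The sketch below is a bare-bones version; the respondents, variables, and targets are hypothetical, and a production run would add convergence checks and weight trimming.

```python
# A bare-bones raking (iterative proportional fitting) sketch.
# The respondents and target margins are hypothetical; production code
# would add convergence checks, weight trimming, and real benchmarks.
import pandas as pd

def rake(df, targets, iterations=25):
    """Adjust weights so each variable's weighted margins match its targets."""
    df = df.copy()
    df["weight"] = 1.0
    for _ in range(iterations):
        for var, target in targets.items():
            shares = df.groupby(var)["weight"].sum() / df["weight"].sum()
            df["weight"] *= df[var].map(lambda v: target[v] / shares[v])
    return df

respondents = pd.DataFrame({
    "age_group":       ["18-34", "18-34", "35-64", "35-64", "35-64", "65+", "65+", "18-34"],
    "income":          ["<50k", "50-100k", "<50k", "100k+", "50-100k", "<50k", "50-100k", "100k+"],
    "votes_regularly": ["yes", "no", "yes", "yes", "no", "no", "yes", "yes"],
})

# Marginal targets for each weighting variable (each set sums to 1).
targets = {
    "age_group":       {"18-34": 0.30, "35-64": 0.50, "65+": 0.20},
    "income":          {"<50k": 0.40, "50-100k": 0.40, "100k+": 0.20},
    "votes_regularly": {"yes": 0.60, "no": 0.40},  # a behavioral measure
}

weighted = rake(respondents, targets)
# After raking, the weighted age margins should sit close to the targets.
print(weighted.groupby("age_group")["weight"].sum() / weighted["weight"].sum())
```

The same loop extends to attitudinal variables: anything you have a trustworthy benchmark for can become a raking margin.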

Beware Digital and Social Media Signaling

Use digital and social media analytics to augment your polling, not to replace it. Digital plays a role and in some ways replaces qualitative research. However, in the current media environment, language has been weaponized, especially online. The data you collect from social media will reflect the socially desirable aspects of people’s personalities or beliefs. People typically put on their best face online, so it’s really difficult to determine what is real and what is simply social signaling.

Shorter Surveys and Polls

As noted above, people lack the time or patience to answer a 15-minute survey, and the respondents who do are probably not an accurate reflection of the overall population. Use third-party data, or purchase demographic data from panel providers, for the variables you need to weight the data. If you have a lot to test, launch multiple short surveys instead of one long survey that respondents hate.
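One way to keep the questionnaire short is to append variables you already have instead of asking for them again. Here is a minimal sketch, assuming the panel provider supplies a profile file keyed by respondent ID; the column names and values are hypothetical.

```python
# A minimal sketch of appending panel-provider profile data to short survey
# responses by respondent ID, so the questionnaire itself can stay short.
# Column names and values are hypothetical.
import pandas as pd

# Short survey: just the questions you actually need to ask.
survey = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "ballot_choice": ["A", "B", "A"],
})

# Profile data purchased from (or supplied by) the panel provider.
profile = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "age_group":     ["18-34", "35-64", "65+"],
    "income":        ["<50k", "100k+", "50-100k"],
})

# Append the profile variables needed for weighting without lengthening
# the survey itself.
merged = survey.merge(profile, on="respondent_id", how="left")
print(merged)
```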

Vary Your Sampling

For political polling, don’t rely solely on verified registered-voter files for your samples. But don’t rely solely on general-population samples that use screening questions to determine voting status or intent, either. Do both.
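One way to put this into practice is to field the same ballot test against both frames and look at the spread of estimates rather than forcing a single number. A minimal sketch with made-up data (the frame labels, weights, and responses are purely illustrative):

```python
# A minimal sketch of comparing the same ballot test across two sample frames.
# The data, weights, and frame labels below are made up for illustration.
import pandas as pd

responses = pd.DataFrame({
    # "voter_file" = drawn from a verified registered-voter file;
    # "gen_pop"    = general-population sample screened for vote intent.
    "frame":  ["voter_file"] * 5 + ["gen_pop"] * 5,
    "ballot": ["A", "A", "B", "A", "B", "B", "A", "B", "B", "A"],
    "weight": [1.0, 0.8, 1.2, 1.1, 0.9, 1.0, 1.3, 0.7, 1.0, 1.0],
})

def weighted_support(df, candidate):
    """Weighted share of respondents choosing the given candidate."""
    chose = (df["ballot"] == candidate) * df["weight"]
    return chose.sum() / df["weight"].sum()

# Report the range of outcomes across frames instead of a single point estimate.
for frame, group in responses.groupby("frame"):
    print(f"{frame}: candidate A at {weighted_support(group, 'A'):.1%}")
```

When the two frames disagree, that spread is exactly the wider range of possible outcomes worth reviewing.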

Pollsters are like generals: they’re always fighting the last war, and every election is different. Chances are, you will miss. Tight screens might work for one cycle and then be absolutely the wrong option in another. Vary your sampling, and you will have a wider range of possible outcomes to review and analyze.

There are no easy solutions to cure what ails the polling industry. Technology has given us better analysis tools while making it more difficult to collect data to analyze.

Typically, when faced with these situations, people focus on what they know and keep doing the same things over and over until they’re forced to change.

Our jobs are too important to simply ignore the serious issues we now face or to leave them for the next generation to address. We need to change and evolve instead of deriding or attacking other ideas about how we should do our jobs.

The suggestions above are just that: suggestions. If you have a better mousetrap or idea, I would love to hear it, and I hope others in the industry are ready to be open-minded, too.