Can We Trust Surveys & Data After the 2016 Election?

December 2, 2016 | Strategy

Regardless of your political leanings, the 2016 election outcome was shocking. Why were the results so surprising? Because the data was wrong. And in 2016, that’s not something we’re used to. It’s easier than ever to access the opinions of millions and to make sense of them, so how could the polls have been so wrong, and how can we go back to trusting similar data in the future?

First, we need to identify why the polls failed us. There are three main flaws with poll data:

Polls and surveys, no matter how vast, are only one data point to rely on.

And they’re a qualitative data point at that. Even with quantitative data, we lose information to cross-device behavior, lost attribution, and things simply slipping through the digital ether. The challenge is tenfold when we ask survey respondents to self-identify and predict the steps they think they would take. If you know how to read data, hedge your bets and bring multiple analytics sources into your view of the overall state of affairs. If you don’t know how to read data, hire someone who does before you drive yourself mad over what one customer says, or what one customer does onsite in real time.

Polls and surveys tell us how people feel, not how they will act.

Most people feel it is their “civic duty” to vote, but still, 45 percent of voting-age Americans didn’t vote. Consumers might tell you that seeing more of X is important to them, but assuming that giving them more of X will mean more conversions can be a naive misstep. Feelings are not fact. Emotions are not action. Remember, people tell us how they want to be viewed, not necessarily their reality. It’s also important to think about who actually responds to a survey: most likely your most vocal customers (strongly positive or strongly negative), but not necessarily the ones most likely to become a customer or convert again. Check out the Yelp page of any business if you doubt this is true.

Polls and surveys can lead to premature analysis & action, which can be harmful to real success.

There’s a chance that Hillary Clinton supporters saw the strong lead, assumed it meant a sure win, and avoided the election-day lines. There’s also a chance you’re reading a survey that only 20 percent of your customers answered and using it to outline your next marketing or merchandising plan. “Well, they said they wanted more pants! And at 20 percent off.” This is a mistake. Look at all of your data holistically, and then test those ideas (then test again and again).

So, where do we, as people dependent on data and the opinions of customers, or potential customers, go from here? The first step to recovery is always acceptance. We have to accept that we cannot simply rely on our customers to self-identify. Look at both quantitative and qualitative data. What do we know about our customer based on analytics? What do they say about themselves? And how can we measure this going forward?

Look at your analytics and answer these questions:

  • Where are my customers coming from?
  • How are these customers behaving? Are my most popular channels my most profitable?
  • From my most profitable channels, what is the demographic data about the most common customer, and about the most profitable customer?
  • From my most profitable channels, what is the path that brings them there?
  • For my most common (highest-session) channel, what is the typical path length?
  • Where in the U.S. are most of my customers? Are they coastal or region-specific?
  • What are my top performing products? Are they affected by seasonality?
  • What are my pages with the highest per session value? And what are my pages with the highest bounce rate?
  • How are sessions and revenue correlated over time?
  • What are my abandonment rates, and how do they trend over time? (See the sketch after this list for one way to compute both this and the correlation above.)
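
If your analytics platform lets you export daily data, a few of these questions reduce to a handful of lines of code. Below is a minimal sketch, assuming a hypothetical export named daily_analytics.csv with columns date, sessions, revenue, carts_created, and purchases; your tool’s actual export format will differ.

```python
# Minimal sketch: answering two of the questions above from a hypothetical
# daily export (file and column names are assumptions, not a real schema).
import pandas as pd

df = pd.read_csv("daily_analytics.csv", parse_dates=["date"]).sort_values("date")

# How are sessions and revenue correlated over time?
# Values near 1 mean traffic and revenue rise and fall together;
# values near 0 mean extra traffic isn't translating into revenue.
session_revenue_corr = df["sessions"].corr(df["revenue"])

# What are my abandonment rates, and how do they trend over time?
# Cart abandonment = share of carts that never became purchases.
df["abandonment_rate"] = 1 - df["purchases"] / df["carts_created"]
monthly_abandonment = df.set_index("date")["abandonment_rate"].resample("M").mean()

print(f"Sessions vs. revenue correlation: {session_revenue_corr:.2f}")
print(monthly_abandonment.round(3))
```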

For questions that aren’t answered definitively by the data above, or for an inference you want to confirm or deny, ask your customers. Typeform is a great survey tool with templates if you’re feeling stuck on how to phrase your questions for a consumer-facing audience.

Once you have your quantitative and qualitative data, you need to put it all to the test. I’m going to say it again because this is so important: test, test, and test! Exactly what you test will depend on the questions you had.
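
What does “test” look like in practice? When you’re comparing a variant against a control, one simple sanity check is a two-proportion z-test on conversion rates. Here’s a minimal sketch; the visitor and conversion counts are made up, and a proper testing tool will handle this (and sample-size planning) for you.

```python
# Minimal sketch: is the lift from an A/B test more than noise?
# Uses a pooled two-proportion z-test; the counts below are hypothetical.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for conversion rates A vs. B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Control: 5,000 visitors, 200 conversions. Variant: 5,000 visitors, 250 conversions.
z, p = two_proportion_z_test(200, 5000, 250, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p (e.g. < 0.05) suggests the lift is real
```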

Unsure why a certain marketing channel isn’t converting, even though customers acquired through that channel say they’re interested? Crazy Egg is a heat-map tool that lets you sort by source, so you can see exactly what these users are doing onsite, not just how long they’re there. Have a long customer life cycle and can’t tie your success back to one source? Glew.io can help keep track of which customers are converting on which products, when, and how often. Knowing your customers’ lifetime value is crucial to online success. Think your converting customer may be different from those driving a lot of sessions? Install Quantcast on your thank-you page to capture a more complete picture of your customer.
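
If you want a rough, do-it-yourself read on lifetime value before (or alongside) adopting a tool like Glew.io, a plain order export gets you surprisingly far. The sketch below assumes a hypothetical orders.csv with customer_id and order_total columns; it computes historical revenue per customer, which understates true LTV for customers who are still active.

```python
# Minimal sketch: historical customer lifetime value from an order export
# (file and column names are assumptions, not any platform's real schema).
import pandas as pd

orders = pd.read_csv("orders.csv")

per_customer = orders.groupby("customer_id")["order_total"].agg(["sum", "count"])
avg_order_value = orders["order_total"].mean()
orders_per_customer = per_customer["count"].mean()
historical_ltv = per_customer["sum"].mean()  # revenue to date per customer

print(f"Average order value:      ${avg_order_value:,.2f}")
print(f"Orders per customer:      {orders_per_customer:.1f}")
print(f"Historical LTV (to date): ${historical_ltv:,.2f}")
```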

The list of possible solutions goes on and on. Long story short: We can and should trust polls, surveys, and customer feedback as long as we’re asking the right questions, measuring the results against quantitative data, and testing the results over and over again.

Have a data question not listed here? Contact me at shannon@hawkemedia.com and I’ll be happy to help!
