Being able to gather survey data through online panels is a wonderful option! An entire worldwide industry has migrated to online research methods to gain access to thousands of survey respondents from all walks of life, and it has given us complex programming capabilities, among other things.
Compared to conducting quantitative surveys by telephone, by mail, or in person, online survey methods are faster and less expensive, allow for rich content during surveys, such as photo displays and video streaming, and permit complex survey protocols to run in the background.
And the data we get from online survey panels is just as accurate as the data from those older methods – or is it?
What if you discovered that somewhere between 10% and 30% of your survey data gathered through online panels was untrustworthy? That would mean that you based important decisions on inaccurate information. And if no one paid personal attention to your data file, this has probably happened to you, and you just didn’t realize it.
As we moved into online survey research years ago, we looked closely at our survey data files, searching for “gems” in the open-ended comments provided by respondents. But through this close scrutiny, we also found to our surprise that an unacceptable percentage of surveys were…well, let’s say “unreliable.”
And this did not happen to us once, or with just one data collection provider. It has happened just about every time, with every vendor, for many years now – and it continues today. So if it happened to us over and over, it has probably happened to you.
It is sad to say, but by “unreliable,” we really mean those dishonest survey respondents who have found ways to beat the system without being caught, so that they still receive their incentives from the field house. They make sure the survey stays open in their browser long enough to avoid detection by activity-time checks. They answer every question in some simple albeit thoughtless way. Their responses to open-ended questions pass muster (e.g., “no,” “good,” “great”) but often don’t answer the actual question, because the respondent never bothered to read it. They know there may be traps in the survey to catch “speedsters” and keep an eye out for those tricks. Really – they have the moves down cold.
The field houses are well aware of this issue and have developed useful routines for catching respondent inaccuracies and deceptions. However, it has been our experience that these processes seldom can replace the judgment of a qualified human observer.
So take a closer look at your survey data. Reviewing the survey data in an Excel spreadsheet gives you a number of useful, visual options. You can expand the size of specific columns to more easily evaluate open-ended responses or hide columns of data altogether if they are not needed for the time being. You can use shading and colors to highlight sets of data you consider questionable so that you can return to the line item – the row of one person’s answers – for closer inspection.
You can insert columns where you want and use Excel formulas and functions (e.g., SUM(), COUNTIF(), IF()) that help you examine the data based on the design of your particular questionnaire. For example, if a respondent types in a 3 over and over in response to a grid of rating questions, might that imply that he or she did not bother to read the line items at all? Use an Excel formula to quickly identify which respondents exhibited that behavior.
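As a sketch of how such a check might look (assuming, for illustration, that the rating grid sits in columns B through K and each respondent’s answers occupy one row starting at row 2; adjust the ranges to your own layout), a formula like this flags any row where every rating matches the first one:

```
=IF(COUNTIF(B2:K2, B2)=COUNTA(B2:K2), "CHECK", "")
```

Place it in an empty column, copy it down the length of the file, and every row marked “CHECK” becomes a candidate for closer inspection. COUNTIF() counts how many answers in the row equal the first answer, and COUNTA() counts how many answers are present, so the two are equal only when the respondent gave the identical rating straight down the grid.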
But use caution and judgment before deciding to delete a respondent’s data. That series of 3’s might represent legitimate responses. It is always best to review that person’s entire set of responses if you find a suspicious occurrence.
If all of this data-checking sounds like something you’d rather not do yourself (I had a client the other day say something about preferring a root canal over this extensive data-checking process), give us a call at Brand Planning. We’ll work hard to maintain the quality of your survey data – and we’ll enjoy doing it for you!