Susan Frede, VP Research
In recent studies, Lightspeed Research has observed an inverse relationship between questionnaire length and completion rates: as questionnaires get longer, completion rates fall and more respondents drop out before finishing.
How can this affect your research? Lower completion rates and higher dropout rates may lead to lower-quality data and can impact business decisions. Longer questionnaires also may cause respondent fatigue and poorer-quality responses. In some cases, respondents may not give as much consideration to questions at the end of a long questionnaire as to questions at the beginning. When fatigued respondents take shortcuts in their thinking, researchers are left with response bias.
Key Research Questions
In June 2008, Lightspeed Research fielded research-on-research to answer the following questions:
- How long is too long?
- What is the relationship between questionnaire length and dropout rates?
- Are certain types of questions more likely to cause dropouts?
- How does increasing questionnaire length impact the representativeness of the sample?
- How are key measures impacted as questionnaire length increases?
- Are panelists less satisfied with the survey experience with longer questionnaires?
- Does suspicious behavior increase as questionnaire length increases?
- How many and what types of questions can be asked before exposure to a stimulus without impacting key measure scores?
Six versions of a concept questionnaire were fielded for a new snack food idea. Both the length and the order of the questions varied. The four main versions, based on median completion time, were:
- 8-minute questionnaire (ideal)
- 17-minute questionnaire
- 20-minute questionnaire
- 24-minute questionnaire
Two additional questionnaire versions asked extra screening questions prior to concept exposure. One version added approximately one minute of interview time to the 8-minute questionnaire prior to concept exposure; the other added approximately two minutes.
Response & Dropout Rates
Response rates are fairly consistent across the questionnaire lengths. However, dropout rates increase as questionnaire length increases.
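The two rates discussed here are straightforward to compute. A minimal sketch, using hypothetical figures rather than the study's actual data:

```python
# Illustrative calculation of response and dropout rates.
# The counts below are invented for illustration, not the study's data.

def response_rate(completes, invited):
    """Share of invited panelists who completed the survey."""
    return completes / invited

def dropout_rate(starts, completes):
    """Share of respondents who started the survey but did not finish."""
    return (starts - completes) / starts

# Hypothetical example: 1,000 invitations, 400 starts, 340 completes.
print(f"Response rate: {response_rate(340, 1000):.0%}")  # 34%
print(f"Dropout rate:  {dropout_rate(400, 340):.0%}")    # 15%
```

Note that the dropout rate is computed on starts, not invitations, which is why response rates can stay flat while dropout rates climb with length.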
Respondents dropping out are somewhat different demographically than those completing the entire survey. They are more likely to be older, retired, male and from smaller households without children. Those dropping out are also less likely to do all the grocery shopping and tend to purchase snack foods less often. Although the dropouts are different, they are so few in number that the overall sample is still representative when excluding them. Demographics, habits, and brand usage are generally consistent across the six questionnaire versions among those answering through at least purchase intent (the completion point). Thus, sample representativeness does not appear to be impacted by questionnaire length.
While questionnaire length does not impact sample representativeness, it does have the potential to impact concept measures. In the Lightspeed Research study, the concept scores for the 8-minute questionnaire version tend to be higher than scores for the longer questionnaires. The 20- and 24-minute questionnaire versions receive significantly fewer “definitely would buy” ratings compared to the 8-minute version. There are also several significant differences on liking, value, uniqueness, and believability between the 8-minute version and the longer versions. (See Table 2 on the following page.) It is important to keep the number of questions prior to concept exposure to a minimum. There are several significant differences between the 8-minute questionnaire version and the two versions with the extra upfront questions.
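Differences in top-box proportions like “definitely would buy” are typically tested with a two-proportion z-test. The study does not say which test it used, so the sketch below, with invented counts, is only one plausible approach:

```python
# Two-proportion z-test sketch for comparing top-box rates between two
# questionnaire versions. Counts are hypothetical, not the study's data.
from math import sqrt, erf

def two_prop_z(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for a difference in proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 90/300 "definitely would buy" on the 8-minute version
# vs. 60/300 on the 24-minute version.
z, p = two_prop_z(90, 300, 60, 300)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented counts the difference would be significant at the 95% level (|z| > 1.96); the real study's base sizes and counts are not reported here.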
In the Lightspeed Research study, the shortest questionnaire versions (8-10 minutes) are viewed as more fun while the longer versions (17-24 minutes) are more likely to be seen as boring. On the longest questionnaire version (24 minutes) respondents are less likely to agree that this survey “helps companies make products that better meet people’s needs.”
Respondents taking the longer questionnaire versions (17-24 minutes) are more likely to report that they found questions difficult to answer (6-7% vs. 4% for the shorter versions). Regardless of questionnaire length, the three most frequent questionnaire complaints are:
- Number and type of answer choices – Respondents want to give honest, accurate answers, but sometimes feel that is not possible with the answer choices provided. Researchers sometimes want to force respondents to express an opinion, but it is important to keep in mind that this can be frustrating for respondents. Several respondents suggest including ‘don’t know’ or ‘none’ as an answer choice. It is not a good idea to include these choices on every question because they provide an easy out. However, if a list is not exhaustive, then a ‘don’t know’ or ‘none’ choice can be valid.
- Answering questions based solely on reading a description of the product – Researchers need to tell respondents to base their answers on what they read. The more specific questions are, the harder it is for respondents to answer. For example, asking respondents to rate the flavor or aftertaste is very difficult without having tried the product.
- No ability to skip questions that don’t apply – If a respondent doesn’t use the category that is the subject of the questions, asking him/her multiple category specific questions can become frustrating even when “don’t use” is an answer choice. Consideration should be given to skipping questions that do not apply.
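The last complaint is addressed with skip logic (routing). A minimal sketch of the idea, with invented question texts, a screener that routes non-category users past the whole battery:

```python
# Minimal skip-logic sketch: respondents who don't use the category are
# routed past the category-specific questions entirely.
# Question texts and keys are invented for illustration.

def next_question(answers):
    """Return the next question to ask, or None when the section is done."""
    if "uses_category" not in answers:
        return "Do you eat snack foods? (yes/no)"
    if answers["uses_category"] == "no":
        return None  # skip the entire category battery
    if "frequency" not in answers:
        return "How often do you eat snack foods?"
    return None

print(next_question({}))                      # the screener question
print(next_question({"uses_category": "no"})) # None: battery skipped
```

Real survey platforms express the same idea declaratively (display conditions, branching rules), but the effect is identical: non-users never see questions that do not apply to them.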
Because this questionnaire includes a mix of questions and it concerns a product category that most people use, no respondents commented that the questionnaire is repetitive. However, in other research, we have seen comments that repetitive questions negatively impact overall survey enjoyment. Respondents don’t see the need for repeated questions. Interestingly, those who say a survey is repetitive are also more likely to say it is too long.
Respondents completing a longer questionnaire are no more likely than those completing a short one to be classified as suspicious (exhibiting fraudulent or inattentive behavior). However, placing a large number of questions at the beginning of the questionnaire does lead to more suspicious behavior: the 8-minute questionnaire version has fewer respondents straight-lining an attribute battery (giving the same rating to all attributes) that falls late in the questionnaire.
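Straight-lining is easy to flag programmatically. A minimal sketch, where the rule (an identical rating on every item of a grid) and the data are illustrative rather than the study's actual scoring rules:

```python
# Simple straight-lining flag: a respondent who gives the identical rating
# to every attribute in a grid. Data and threshold are illustrative only.

def is_straight_liner(ratings):
    """True if all ratings in a multi-item grid are identical."""
    return len(ratings) > 1 and len(set(ratings)) == 1

grid_answers   = [4, 4, 4, 4, 4, 4, 4, 4]  # same score on all 8 attributes
varied_answers = [4, 3, 5, 4, 2, 4, 5, 3]

print(is_straight_liner(grid_answers))    # True
print(is_straight_liner(varied_answers))  # False
```

In practice, panels combine several such checks (speeding, gibberish open-ends, near-straight-lining) before classifying a respondent as suspicious; a single flat grid on its own is weak evidence.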
Respondents claiming to be members of eight or more panels have been identified as professional respondents. There does not appear to be a relationship between number of panels and questionnaire length.
Impact of Upfront Questions
As previously mentioned, there are differences in key concept measures (Table 2) and increases in suspicious behavior (Table 4) when the initial screening section of the questionnaire is longer. There are also some significant differences in habit and attitude questions when asked before, as opposed to after, concept exposure. Frequency of purchasing and eating snack foods tends to be higher when asked after concept exposure, and for four of thirteen attitude statements about nutrition, ratings are significantly higher after concept exposure. Exposure to the snack concept could be biasing these results; however, care also needs to be taken not to place too many questions prior to concept exposure, as those questions can bias the concept measures. This suggests that it is important to be consistent in where questions are asked in the questionnaire, so that any bias remains consistent from one project to the next.
None of the screening questions on the 9- and 10-minute questionnaire versions causes a large number of respondents to drop out. However, based on past research-on-research, long grid questions or several grid questions in a row early in the survey generally cause more dropouts. The 10-minute questionnaire has a single upfront grid question with only 13 items. Lightspeed Research has also seen more respondents drop out on allocation questions, which generally ask respondents to indicate how many of their last 10 purchases were for each brand. The longer the brand list, the longer this question can take to complete. The 20- and 24-minute questionnaire versions include a very simple allocation question (17 brands), and there are several dropouts on this question.
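One reason allocation questions cause dropouts is the extra effort of making the answers add up. A minimal validation sketch, with invented brand names, of the check a survey engine would run before accepting the answer:

```python
# Sketch of validating an allocation question: the entries should sum to the
# stated total (e.g. "of your last 10 purchases, how many were each brand?").
# Brand names and the error wording are invented for illustration.

def allocation_error(allocations, total=10):
    """Return None if the allocation is valid, else a message to show."""
    s = sum(allocations.values())
    if s != total:
        return f"Your answers add up to {s}; they should total {total}."
    return None

answer = {"Brand A": 6, "Brand B": 3, "Brand C": 2}  # sums to 11, not 10
print(allocation_error(answer))
print(allocation_error({"Brand A": 7, "Brand B": 3}))  # None: valid
```

The longer the brand list, the more bookkeeping respondents must do to satisfy this check, which is consistent with the dropout pattern described above.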
This research-on-research shows that there is a relationship between questionnaire length and dropout rates: dropout rates increase as questionnaire length increases. Although respondents who drop out differ from those who complete the entire questionnaire, overall sample representativeness is not impacted by questionnaire length. Questionnaire length does affect key concept measures, so business decisions could be impacted. Shorter questionnaires are viewed as more fun while longer questionnaires are seen as boring, and respondents are more likely to report having difficulty answering questions in the longer versions. Although straight-lining behavior is slightly higher among all but the shortest questionnaires, longer questionnaires don’t necessarily lead to more suspicious behavior. Nor do longer questionnaires appeal more to professional respondents: those completing them are not members of a greater number of panels. Asking more questions prior to concept exposure can bias concept results. When more questions are asked upfront, respondents are less likely to continue past the screening portion of the questionnaire and are more likely to exhibit suspicious behavior.
So how long is too long? Concept test results are impacted starting around 20 minutes. Therefore, Lightspeed Research recommends keeping online questionnaire length under 20 minutes. With shorter questionnaires, respondents are more likely to stay engaged and Lightspeed Research is able to establish better rapport with our panelists. Better rapport leads to less attrition and better quality responses. Some suggestions for reducing questionnaire length:
- Carefully evaluate each questionnaire to eliminate “nice to know” questions (i.e. questions that don’t directly relate to the objectives and success criteria).
- Keep the number of questions asked prior to concept exposure to a minimum.
- Think about questions from the respondent’s point of view – if you were a respondent would you want to answer? For example:
- Is it necessary to gather brand usage information down to nearly a SKU level? Can respondents accurately report this information?
- Can respondents answer specific attribute questions about a product when all they have seen is a concept?
- Use technology to make questionnaires more respondent friendly (e.g. to skip questions that don’t apply).
- Watch out for repetitious questions. You may think you are asking different questions (e.g. product is healthy and product is good for you), but respondents often don’t see it that way.
- Consider using split questionnaire designs to break a long questionnaire into manageable tasks. The second questionnaire is fielded to those returning the first questionnaire. This makes the task a little less daunting for respondents while still gathering all the information from the same respondents.
About Lightspeed Research
Lightspeed Research (www.lightspeedresearch.com) is the market researcher’s choice for digitally accessing and deriving insight from consumer opinions and behaviors whenever, wherever and in whatever segments needed. The industry’s most thorough panelist prescreening process and large global pool delivers business-ready results quickly and cost-effectively. From proprietary online access panels to specialty panels, custom panels and innovative mobile surveys, Lightspeed Research offers the industry’s highest-quality and most complete combination of qualitative and quantitative online research. This is backed by an expert client operations team that provides a range of data collection services, from sample management and survey design to programming and reporting. Part of Kantar, a division of WPP, Lightspeed Research serves clients and cultivates online panelists across the Americas, Europe and Asia Pacific. Susan Frede is the VP of Research at Lightspeed Research. She has worked in the research field for 23 years, has published numerous research-on-research papers and is a well-respected speaker at key industry events. Some of the topics she has recently explored include questionnaire length, best practices for online research, suspicious and professional respondents and data stability. You can contact Susan at firstname.lastname@example.org.