WHITE PAPER: COMPLETION RATES ON MOBILE DEVICES

INFLUENTIAL FACTORS ON SURVEY OUTCOMES: LENGTH OF SURVEY, DEVICE SELECTION AND EXTERNAL ELEMENTS

Efrain Ribeiro, CRO, and Stefan Kuegler, Director of Mobile Research

 

EXECUTIVE SUMMARY

Historically, the length of a survey, or length of interview (LOI), has been linked with the overall completion rate and data quality. Consumers can now choose to complete a survey on various types of Internet-accessible devices.

This white paper addresses how the length of the interview, device selection and external elements influence survey completion on mobile devices.

 

LENGTH OF SURVEY

Many external factors can affect survey completion time: Wi-Fi access, location and network connectivity. Although these factors are beyond our control, they should be considered when determining LOI. Several factors, however, are under our control through survey design. The following two elements have the greatest impact on LOI:

1. The number of screens or questions a respondent must answer
2. The complexity of the questions to be answered in terms of selection choices

 

NUMBER OF SCREENS

Both of these elements drive respondents’ experience within the survey and their ability to complete it. On a PC there is ample room on screen, so a respondent can easily read questions and navigate between screens. These advantages are reduced on a mobile device, especially a phone.

The ability to move around the screen is reduced on smartphones and even more so on feature phones. This loss of ‘freedom’ of movement can affect a respondent’s willingness to complete a survey. Questions therefore have to be succinct on screen, and there need to be fewer of them for respondents to scroll through.

Touch devices (mostly smartphones) might make navigation easier, but experience shows that smartphone interactions are quick and frequent. Users do not dwell long on a mobile phone; for longer activities, people still return to the PC/laptop. For those who only use a phone, the more likely response is to stop the survey and move on to the next activity.

 

COMPLEXITY OF QUESTIONS

The question types or style can also impact the survey-taking experience on a mobile device. Scrolling can be harder, especially on feature phones, and should be avoided. Short brand lists and few grids (if any) are recommended because of the scrolling that might be required to see the full question. Respondents might be familiar with pinching and expanding their screens to see text and images better, but this introduces more manipulation on the part of the respondent, which can equate to greater abandonment. Kept to a minimum, it will not have a detrimental impact, but if frequent, respondents can become frustrated by the constant ‘fiddling’ on their devices.

Multimedia also has to be considered very carefully for mobile. The loading time of images and videos can increase the likelihood of abandonment within the survey, and it can also significantly increase the time taken by a respondent. Feature phone users often have limited time available on the internet or limited data plans, so when multimedia is included in a survey we need to consider the likelihood that respondents will be able to access it.

In a recent study, we asked respondents to take a picture at the end of the survey. Only 56% of smartphone respondents completed the task; for feature phones it was only 36%. With videos the drop-out rate was even more dramatic, with 60-70% of respondents failing to make it to the end of a survey in which a video was displayed.

 

SMARTPHONES: IMPACT ON COMPLETION RATES

As the length of a survey increases, the completion rate decreases. A survey’s completion rate is also significantly impacted by the consumer’s choice of device. Android and iPhone users have the highest abandonment rates; in some cases abandonment is nearly double that of a PC.

iPad users did not drop out as quickly; however, they still have a slightly higher drop-out rate than PC users.

 

FEATURE PHONES: IMPACT ON COMPLETION RATES

To reach a representative audience in emerging markets, we will need to consider respondents with feature phones until smartphone use reaches a higher level of penetration. Compared to smartphones, feature phones have smaller screens, lower processing power and less memory, and are unable to handle large data files. In key emerging markets feature phones are still outselling smartphones; however, these countries often have unstable and fragile networks.

Many factors can affect survey completion rates on feature phones, including survey length and design. In recent studies, we have seen the initial drop-out rate range from 30% to 50%. Feature phones lack processing power; therefore, images and HTML code do not load properly, causing users to drop out before even reaching the survey.

 

DEVICE SELECTION

Over the last 12 months, BlackBerry (BB), Android and iPhone users have had longer survey lengths than Windows/Mac PC users. See Table 2 below.

iPhone LOI is significantly lower than Android LOI due to the uniformity of iPhones compared to the vast selection of Android devices, which range from high-end, large-screen tablets to small-screen smartphones. Additionally, the data show that iPads align more closely with PC/laptops. BBs have one of the highest LOI averages; however, BBs are sometimes considered feature phones.

TABLE 2: AVERAGE LOI FOR PAST 12 MONTHS ON MYSURVEY PANEL


Average LOI
              Android   BlackBerry   iPad    iPhone   Mac PC   Windows
All Surveys   24.35     24.29        18.90   21.93    18.87    19.58
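
To put these averages in context, the following is a minimal sketch (in Python, our choice of illustration language) that compares each device's average LOI from Table 2 against a PC baseline; using the Windows figure as that baseline is our assumption.

```python
# Average LOI by device, taken from Table 2 (MySurvey panel, past 12 months).
avg_loi = {
    "Android": 24.35,
    "BlackBerry": 24.29,
    "iPad": 18.90,
    "iPhone": 21.93,
    "Mac PC": 18.87,
    "Windows": 19.58,
}

# Assumption: the Windows PC figure is used as the baseline for comparison.
baseline = avg_loi["Windows"]

for device, loi in avg_loi.items():
    pct_diff = (loi / baseline - 1) * 100
    print(f"{device:<10} {loi:5.2f}  ({pct_diff:+.1f}% vs Windows PC)")
```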

 

MOBILE DEVICES COMPARED TO PC

Results were evaluated for the average LOI of the same surveys as recorded on various devices. The figure below (Figure 2) compares the differences between each smartphone device and PC/laptops. The information is aggregated over the past 12 months; note, however, that most of the surveys evaluated were not optimized for mobile.

Our findings indicate that, for the same survey, mobile phone respondents took longer. On average, mobile phone users took 30-40% longer, depending on several variables (e.g., phone type and survey complexity). Tablets, at a 5-10% increase, did not show as significant a rise in completion time.

For example, a survey that takes 5-10 minutes on a PC will take about 30% longer on an Android device, about 27% longer on an iPhone and about 5% longer on an iPad. These results should be considered when we estimate LOI for mobile devices.
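
As an illustration of how these figures could be applied, here is a minimal sketch (in Python) of adjusting a PC-based LOI estimate for different devices; the uplift values are the approximate percentages quoted above, and the helper name estimate_mobile_loi is ours.

```python
# Approximate LOI uplift versus PC/laptop, using the percentages quoted above.
UPLIFT = {
    "Android": 0.30,  # roughly 30% longer than on a PC
    "iPhone": 0.27,   # roughly 27% longer
    "iPad": 0.05,     # roughly 5% longer
}

def estimate_mobile_loi(pc_loi_minutes: float, device: str) -> float:
    """Estimate the expected LOI on a device, given the PC/laptop LOI in minutes."""
    return pc_loi_minutes * (1 + UPLIFT[device])

# Example: a survey that takes 10 minutes on a PC.
for device in UPLIFT:
    print(f"{device}: ~{estimate_mobile_loi(10, device):.1f} minutes")
```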

FIGURE 2: COMPARING SURVEY LENGTH OF INTERVIEW ON PC/LAPTOP WITH OTHER DEVICES

 


 

SOLUTIONS

Given the diverse landscape of mobile devices, we need to be careful about what we ask respondents to complete while still maintaining a measure of data quality.

Based on experience and published data, other suppliers place restrictions on survey length and question style.

Smartphones might offer slightly easier movement and navigation around the screen, but there is still a limit to what respondents will do. Today, the right length for a smartphone survey is 10 minutes (or around 25-30 screens, depending on complexity). The survey should be mobile optimized, including the following attributes:

• Few (or no) grids
• Minimal vertical scrolling [It is assumed that horizontal scrolling is avoided at all costs]
• Succinct wording in both question and answer text
• Answer lists with fewer than 10 options

Feature phones are a step down again in terms of usability and navigation. For feature phones, we have to be more conservative and aim to keep surveys to 6-8 minutes (which translates to 10-15 screens). The greater reduction in the number of screens is due to the increased difficulty of reading on a small screen, and also the non-touch navigation, which can increase the time taken to move between screens.
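
To make these guidelines concrete, the following is a minimal sketch (in Python) that checks a planned survey against the suggested length and screen budgets; the encoded limits and the function name fits_device_guidelines are our own framing of the guidance above.

```python
# Suggested maximums per device class, encoding the guidance above (our encoding).
GUIDELINES = {
    "smartphone":    {"max_minutes": 10, "max_screens": 30},
    "feature_phone": {"max_minutes": 8,  "max_screens": 15},
}

def fits_device_guidelines(device: str, minutes: float, screens: int) -> bool:
    """Return True if a planned survey stays within the suggested budget for a device."""
    limits = GUIDELINES[device]
    return minutes <= limits["max_minutes"] and screens <= limits["max_screens"]

# Example: a 9-minute, 28-screen survey fits the smartphone guideline but not feature phones.
print(fits_device_guidelines("smartphone", 9, 28))     # True
print(fits_device_guidelines("feature_phone", 9, 28))  # False
```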

 

RESEARCH

We are currently looking into the effect of LOI on data quality. This study will be run across multiple countries to understand how these factors impact each other. We will also review respondent satisfaction and the level of drop-out during the survey to try to understand where the optimum survey length might be for mobile devices.

Research has been done in this area over the past few years. Some of the most recent and related papers include:

Fine, B. and Menictas, C. (2012), The who, when, where and how of Smartphone research. AJMSR, Vol. 20, No. 2.

Menig, M. and Miller, C. (2014), Our Evolving Ecosystem: A Comprehensive Analysis of Survey Sourcing, Platforms, and Lengths. True Sample Conference 2014.

Kachhi-Jiwani, D., Tucker, J. and Wilding-Brown, L. (2013), Managing mobile research: How it’s different and why it matters. Quirk’s, March 2013.

Stapleton, C. E. (2013), The Smartphone Way to Collect Survey Data. Survey Practice, Vol. 6, No. 2. www.surveypractice.org

Callegaro, M. and Macer, T. (2011), Designing surveys for mobile devices. Presented at the American Association for Public Opinion Research Conference, Phoenix, AZ.

 
