
Eight Seconds from Opine to Click – Respondent and Question Effects on Response Times in a Large-Scale Web Panel

Oliver Serfling (Faculty of Society and Economics, Rhine-Waal University of Applied Sciences)

Keywords: Methodological challenges and improvements, including in the areas of sampling, measurement, survey design and survey response or non-response

Abstract

The median web-survey participant needs 8 seconds to read a question, comprehend it, and select one of up to ten alternative answer options, but which variables determine respondents' answer speed? Based on rational-choice theory and stratified samples drawn from 3.5 million users with more than 100 million responses to a German web survey, linear and logistic regressions with panel data were run to test hypotheses about respondent behavior. This study identifies significant respondent, questionnaire, and interaction effects on response times.
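
To make the modeling approach concrete, the following is a minimal sketch, not the authors' code or specification: an OLS regression of log response time on respondent and question characteristics, with respondent-clustered standard errors as a simple stand-in for the panel estimators described in the abstract. All column names and the synthetic data are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' specification): log response time
# regressed on respondent and question features, clustered by respondent.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "user_id": rng.integers(0, 500, n),        # hypothetical panel identifier
    "age": rng.integers(18, 80, n),
    "years_schooling": rng.integers(8, 18, n),
    "question_words": rng.integers(5, 60, n),  # question complexity proxy
    "n_options": rng.integers(2, 10, n),
    "dk_offered": rng.integers(0, 2, n),       # "don't know" option shown?
    "weekday": rng.integers(0, 7, n),
    "hour": rng.integers(0, 24, n),
})
# Synthetic response times so the example runs end to end.
df["response_time_sec"] = np.exp(
    2.0 + 0.01 * df["question_words"] - 0.02 * df["years_schooling"]
    + 0.3 * df["dk_offered"] + rng.normal(0, 0.4, n)
)

# Inverse U-shape in age, an education x question-length interaction,
# option count, DK availability, and time-of-survey dummies.
model = smf.ols(
    "np.log(response_time_sec) ~ age + I(age**2)"
    " + years_schooling * question_words"
    " + n_options + dk_offered + C(weekday) + C(hour)",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["user_id"]})
print(result.summary())
```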

The findings show that respondents' sociodemographic characteristics such as age, gender, marital status, and education have significant effects on their response times. Age has an inverse U-shaped effect, with response times reaching their minimum for respondents in their thirties. The same non-linear pattern holds for participation experience, measured by the number of polls completed in the past and the length of panel membership. Moreover, full-time employees and married respondents answer fastest, while divorced and self-employed respondents answer slowest. Response time is also inversely related to the number of years of schooling, and this effect is amplified when education is interacted with the complexity of the polling question, measured by its word count.

With regard to questionnaire effects, the results reveal increased response times for wordier questions and answer options. Response time also increases with the average word length in the text. However, the text in the answer options is processed faster when it is spread across a larger number of options and when word lengths vary more. Results on the relationship between education and the number of answer options indicate that less-educated participants tend to reduce their cognitive effort by selecting an answer at random, leading to lower response times than those of more highly educated respondents. Examining when the survey took place, participants react more slowly than average throughout the weekend and between 11pm and 4am, and are slowest between 1am and 2am. They are quickest when surveyed between Tuesday and Thursday and between 9am and 10am, with considerably below-average response times from 7am to 3pm.

Providing a "don't know" (DK) option in a survey question increases response times by approximately one second, leading us to conclude that the average user evaluates the question and the answer options carefully before choosing the DK option. We show that this DK effect can be decomposed into (1) a response-time-increasing effect of offering a DK option that exceeds the effect of an additional answer option and (2) additional time spent reasoning. There is also evidence, however, that longer questions slightly increase the use of the DK option.
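
The claim that wordier questions prompt DK use could be probed with a logistic model of DK selection. The following is a minimal, self-contained sketch under assumed column names and synthetic data; it is not the authors' specification.

```python
# Minimal sketch (illustrative assumptions): logistic regression of choosing
# the "don't know" option on question length and education.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
df = pd.DataFrame({
    "question_words": rng.integers(5, 60, n),
    "years_schooling": rng.integers(8, 18, n),
})
# Synthetic DK choices with a mild positive dependence on question length.
logit_p = -2.5 + 0.02 * df["question_words"]
df["dk_selected"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

dk_model = smf.logit("dk_selected ~ question_words + years_schooling", data=df).fit()
print(dk_model.summary())
```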

The results contribute to the survey-research literature on respondent behavior and have practical implications for the design of online surveys. They can also serve as building blocks toward assessing the reliability of users and their answers in an anonymous setting.