Longitudinal and panel surveys suffer from panel attrition, which may result in biased estimates. Online panels are no exception, but the real-time availability of paradata makes them well suited to monitoring and managing the data-collection phase and to adjusting response-enhancement features (such as reminders). This paper presents a data-driven approach to monitoring the data-collection phase and informing the adjustment of response-enhancement features across online panel waves, taking into account the characteristics of the ongoing panel wave. For this purpose, we study the evolution of the daily response proportion in each wave of a probability-based online panel. Using multilevel models, we predict the data-collection evolution per wave day; in our example, its functional form is quintic. The characteristics that shape this evolution are those of the specific wave day, not of the panel wave itself. In addition, we simulate the monitoring of the daily response proportion in one panel wave and find that, after 20 consecutive panel waves, the timing of reminders could be adjusted to keep the data-collection phase efficient. Our results demonstrate the importance of re-evaluating the characteristics of the data-collection phase, such as the timing of reminders, across the lifetime of an online panel to keep the fieldwork efficient.
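To make the modelling step concrete, the sketch below fits a multilevel (mixed-effects) model of the daily response proportion with a quintic polynomial in wave day and a random intercept per panel wave. This is a minimal illustration under assumptions, not the authors' exact specification: the data frame, its columns (`wave`, `day`, `resp_prop`) and the simulated response pattern are hypothetical stand-ins for the panel's paradata.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format paradata: one row per wave day, identified by
# the panel wave and the day within that wave, with the observed daily
# response proportion. All names and values here are illustrative.
rng = np.random.default_rng(42)
days = np.arange(1, 31)
waves = np.arange(1, 21)
df = pd.DataFrame([(w, d) for w in waves for d in days],
                  columns=["wave", "day"])
# Simulate a decaying daily response proportion purely for demonstration.
df["resp_prop"] = (0.25 * np.exp(-0.3 * df["day"])
                   + rng.normal(0, 0.005, len(df))).clip(lower=0)

# Centre the wave-day variable before raising it to higher powers, which
# reduces collinearity among the polynomial terms.
df["c_day"] = df["day"] - df["day"].mean()
for p in range(2, 6):
    df[f"c_day{p}"] = df["c_day"] ** p

# Multilevel model: wave days (level 1) nested in panel waves (level 2),
# with quintic fixed effects in wave day and a random intercept per wave.
model = smf.mixedlm(
    "resp_prop ~ c_day + c_day2 + c_day3 + c_day4 + c_day5",
    data=df,
    groups=df["wave"],
)
result = model.fit()
print(result.summary())

# Fixed-effects prediction of the data-collection evolution, which could
# be compared against the incoming daily response proportions of an
# ongoing wave to decide whether a reminder should be rescheduled.
df["predicted"] = result.predict(df)
```

In practice, the simulated frame would be replaced by the panel's accumulated paradata, and wave-day characteristics (for example, indicators for reminder days) could enter as further fixed effects, in line with the finding that wave-day characteristics, rather than wave characteristics, shape the data-collection evolution.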