Longitudinal and Life Course Studies
An international journal

Preventing interview falsifications during fieldwork in the Survey of Health, Ageing and Retirement in Europe (SHARE)

Authors:
Michael Bergmann, Munich Center for the Economics of Aging (MEA), Max Planck Institute for Social Law and Social Policy, Germany
Karin Schuller, Munich Center for the Economics of Aging (MEA), Max Planck Institute for Social Law and Social Policy, Germany
Frederic Malter, Munich Center for the Economics of Aging (MEA), Max Planck Institute for Social Law and Social Policy, Germany

The fabrication of an entire interview is a rare event in the Survey of Health, Ageing and Retirement in Europe (SHARE), but it can nevertheless have negative consequences for the panel sample, such as a loss in sample size or the need for time-consuming corrections of data collected in previous waves. The work presented in this article started with the discovery of a case of interviewer fabrication after fieldwork for the sixth wave of SHARE was completed. As a consequence, we developed a technical procedure to identify interview fabrication and deal with it during ongoing fieldwork in the seventh wave. Unlike previous work, which often used small experimental datasets and/or only a few variables to identify fake interviews, we implemented a more complex approach: a multivariate cluster analysis using many indicators from the available CAPI data and paradata. Analyses with the known outcome (interview fabrication or not) in wave 6 revealed that we were able to correctly identify a large number of the truly faked interviews while keeping the rate of ‘false alarms’ rather low. With these promising results, we started using the same script during the fieldwork for wave 7. We provided the survey agencies with information for targeted (instead of random) back checks to increase the likelihood of confirming our initial suspicion. The results show that only a very small number of interview fabrications could be unequivocally identified.
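The core idea of the approach described above, clustering interviews on many CAPI and paradata indicators and routing the anomalous cluster to targeted back checks, can be illustrated with a toy sketch. Everything here is an assumption for illustration: the two indicators (interview duration and share of 'don't know' answers), the synthetic data, and the simple two-cluster k-means are stand-ins, not the authors' actual SHARE script or variable set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interview-level indicators (not actual SHARE variables):
# column 0 = interview duration in minutes,
# column 1 = share of 'don't know'/refusal answers.
genuine = np.column_stack([rng.normal(60, 10, 200),
                           rng.normal(0.05, 0.02, 200)])
# Five fabricated interviews: much shorter and with almost no item
# nonresponse, a pattern the falsification literature often reports.
fakes = np.column_stack([rng.normal(20, 3, 5),
                         rng.normal(0.002, 0.001, 5)])
X = np.vstack([genuine, fakes])          # fakes occupy rows 200..204

# Standardise so both indicators contribute on a comparable scale.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Two-cluster k-means with a deterministic farthest-point initialisation:
# one centre at the overall mean, one at the most atypical interview.
c0 = Z.mean(axis=0)
c1 = Z[np.linalg.norm(Z - c0, axis=1).argmax()]
centers = np.vstack([c0, c1])
for _ in range(50):
    dist = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
    labels = dist.argmin(axis=1)
    centers = np.vstack([Z[labels == j].mean(axis=0) for j in (0, 1)])

# The smaller cluster is the candidate set for targeted back checks.
small = np.argmin(np.bincount(labels))
flagged = np.flatnonzero(labels == small)
print("interviews flagged for back checks:", flagged)
```

In this simplified setting the small, atypical cluster contains the fabricated interviews; in practice, as the abstract notes, such a flag only raises suspicion, and the survey agencies' back checks are needed to confirm it.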

