Interviewer effects on the reporting of intimate partner violence in the 2015 Zimbabwe Demographic and Health Survey

  • 1 St. Michael’s Hospital, Canada
  • 2 University of Michigan School of Nursing, USA

Abstract

Intimate partner violence is a global public health concern that is widely under-reported. Socio-demographic factors of the interviewer may contribute to a reluctance to report violence. The introduction of the fieldworker survey to the 2015 Zimbabwe Demographic and Health Survey provides the first opportunity to test associations between interviewer characteristics and the reporting of intimate partner violence in the largest source of data on intimate partner violence available for low- and middle-income countries. Three separate, multilevel logistic regression models were used to examine associations between the reporting of physical, sexual and emotional intimate partner violence and interviewer characteristics (age, sex and marital status, as well as differences in these indicators between interviewer and respondent), language of the interview and the interviewer’s previous experience conducting the Demographic and Health Survey. Previous experience as a Demographic and Health Survey interviewer was associated with significantly lower odds (OR: 0.67) of reporting physical intimate partner violence. Researchers should consider using the fieldworker data set in future studies to control for potential interviewer error, account for the clustering of data by interviewer and increase the robustness of Demographic and Health Survey analyses. Understanding how interviewers may shape the reporting of intimate partner violence is a step towards accurately measuring its burden in low- and middle-income countries.

Key messages

  • This is the first study to measure interviewer effects regarding the reporting of intimate partner violence in the Demographic and Health Survey.

  • Previous experience conducting the Demographic and Health Survey was significantly associated with lower odds of a respondent reporting physical intimate partner violence.

  • Using the fieldworker data set will help improve the rigour of Demographic and Health Survey analyses and identify interviewer effects in other countries.

Introduction

Globally, one in three women will report intimate partner violence (IPV) over their life course (World Health Organization, 2017). In many low- and middle-income countries (LMICs), the rate of reported IPV is much higher, ranging from an average of 24.6 per cent in Western Pacific countries to 37.7 per cent in South-East Asia (World Health Organization, 2013). IPV is associated with negative health outcomes through physical trauma, increased stress and anxiety, and the fear and control that often accompany abusive relationships (World Health Organization, 2013; 2017). The effects of IPV are also shown to extend well beyond the victim of abuse, negatively affecting the entire family unit (Campbell, J. et al, 2002; Campbell, R. et al, 2009; World Health Organization, 2013).

An accurate measure of prevalence is foundational to the development of effective interventions and policies for the primary and secondary reduction of IPV. However, IPV is thought to be widely under-reported (Ellsberg et al, 2001; Garcia-Moreno et al, 2006; Palermo et al, 2014). Many women are reluctant or unable to report violence due to feelings of shame, the fear of being blamed, a reluctance to be seen as disloyal or fear for their safety (Ellsberg et al, 2001; Jewkes et al, 2002; Palermo et al, 2014). Women may not categorise the acts that they experience as violence, particularly in conservative, patriarchal societies in which IPV is often normalised (Stephenson et al, 2008; Palermo et al, 2014). In a global study of women in ten LMICs, only 34–79 per cent of women who reported physical IPV had ever disclosed their experiences to anyone, fewer than 10 per cent had reported it to law enforcement and fewer than 6 per cent had reported it to medical services (Garcia-Moreno et al, 2006).

In addition to sociocultural reasons for the under-reporting of IPV, the socio-demographic characteristics of the person collecting data may influence the reporting of IPV (Ellsberg et al, 2001). This phenomenon, known as interviewer error, has been documented since the late 1960s: studies show that interviewer characteristics such as gender, race, age and previous experience as an interviewer can influence how someone responds to a sensitive survey question (Schuman and Converse, 1971; Krysan and Couper, 2003; McGlone et al, 2006; Davis et al, 2009; Visschers et al, 2017).

Another driver of error in surveys is respondents’ social desirability bias, that is, answering an interviewer’s question in a way that paints the respondent in a positive light (Edwards, 1953). Although it has been studied since the 1950s, more recent evidence points to two main dimensions of social desirability bias: impression management and self-deception (Paulhus, 2001; Ventimiglia and MacDonald, 2012). Impression management refers to respondents who self-scrutinise and edit their responses based on a desire to present a positive image or to avoid revealing a stigmatised health behaviour (Perinelli and Gremigni, 2016; Visschers et al, 2017). This behaviour is also shaped by the respondents’ cultural environment and the social scripts that they feel they should conform to. Self-deception occurs when respondents deny their participation in stigmatised behaviours (Ventimiglia and MacDonald, 2012). Regarding IPV, there is evidence from Europe that social desirability bias is a conscious process among survey respondents but that it largely stems from impression management and not self-deception (Visschers et al, 2017).

In LMICs, social desirability bias is especially common when the interviewer is perceived as a stranger (Weinreb, 2006) or deemed to be an ‘outsider’ to one or more demographic groups (Davis et al, 2009; Milligan, 2016). To reduce interviewer error, many surveys have moved beyond face-to-face survey administration, making use of web-delivered surveys (Stephenson et al, 2013; Chard et al, 2015) or computer-assisted self-interviewing (CASI) techniques (Tufts University School of Medicine, no date). These methods have been shown to reduce social desirability bias and interviewer error (Hines et al, 2010) but have not yet been implemented in the Zimbabwe Demographic and Health Surveys (DHS).

These dual drivers, collectively known as interviewer effects, have the potential to bias data collection and analysis. If respondents feel that they cannot truthfully answer a particular question due to either interviewer error or desirability bias, the mean, variance, standard deviation and standard errors of the question will be altered. This has the potential to significantly change statistical associations between variables (Davis, 1997; Groves, 2004; Davis et al, 2009). Each interviewer commonly interviews multiple respondents, resulting in the clustering of respondents by interviewer and compounding the effects of social desirability bias. While some IPV studies have attempted to mitigate interviewer effects using multilevel modelling techniques (Jewkes et al, 2002; 2003), accounting for the clustering of respondents by interviewer, this only identifies that bias exists; it does not identify which characteristics of the interviewer are creating the bias.

The DHS represents the largest source of information on IPV in LMICs. Consisting of data on 90 countries across more than 30 years, the DHS is a comprehensive, publicly available source of data across a range of biological, behavioural and social indicators (ICF International, 2017b). The DHS uses a standard interviewer training manual across all countries that includes both didactic training and practical data-collection experience (ICF International, 2017a). Classroom training emphasises the sensitivity of data collection, the imperative for privacy and the importance of confidentiality for all modules. It also reviews each section of the three major surveys (Household, Individual and Biomarker) in detail and provides demonstration interviews by experienced interviewer-trainers. Trainees also participate in role-play sessions before conducting supervised practice interviews with real households. Potential interviewers are then given written tests, with final selection as a DHS fieldworker based on their successful completion (ICF International, 2017a). Despite rigorous training for interviewers and a global reliance on the DHS for IPV data in LMICs, there has been no mechanism by which to assess interviewer error in the DHS. The announcement that a survey of interviewers had been implemented in the 2015 Zimbabwe DHS signals the first opportunity to quantitatively test associations between interviewer characteristics, differences in characteristics between interviewer and respondent, and DHS variables (Kishor et al, 2017). This study aims to fill a gap in the literature by being the first to examine interviewer error in the DHS as it pertains to the reporting of physical, sexual and emotional IPV.

Methods

This analysis combined two surveys from the 2015 Zimbabwe DHS: the individual survey of women aged 15–49 (n = 9,955) and the fieldworker (interviewer) data set (n = 120). For the women’s data set, the DHS first used the most recent Zimbabwean census data to create geographic demarcations called Primary Sampling Units (PSUs). A total of 20 to 30 households were interviewed from each PSU, and approximately 73 per cent of these respondents were randomly selected to answer the Domestic Violence Module. Of these, only ever-married women were asked questions about current or past IPV (n = 5,522). All interviewers trained to collect the 2015 DHS women’s survey were interviewed by DHS staff. The data from the women’s and fieldworker surveys were merged using the fieldworker identification code (v028), resulting in a data set in which each record represents a woman aged 15–49, with the characteristics of her interviewer appended to her record.
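As an illustration only (the paper does not name its software), the merge described above could be reproduced along the following lines in Python with pandas. The file names, and the identification variable assumed for the fieldworker file, are placeholders; only v028 in the women’s file is named in the text.

```python
import pandas as pd

# Placeholder file names: the DHS individual recode (women) and fieldworker files
# are distributed by the DHS Program and require registration to download.
women = pd.read_stata("zimbabwe_2015_women.dta", convert_categoricals=False)
fieldworkers = pd.read_stata("zimbabwe_2015_fieldworkers.dta", convert_categoricals=False)

# v028 in the women's file holds the interviewer identification code; "fw_id" is a
# hypothetical name for the matching code in the fieldworker file.
merged = women.merge(
    fieldworkers.add_prefix("int_"),   # prefix interviewer variables to avoid name clashes
    left_on="v028",
    right_on="int_fw_id",
    how="left",
)
```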

Outcomes

Three outcome variables measured the reported lifetime prevalence of three types of IPV. Each was coded 1 if the respondent indicated that her husband or male partner had ever committed physical violence (pushed, shook or threw something at her; slapped, punched or kicked her; attempted to strangle or burn her; twisted her arm or pulled her hair; or threatened her with a knife or gun), sexual violence (physically forced sex when not wanted; forced other sexual acts when not wanted) or emotional violence (humiliated her in public, threatened her with harm, insulted her or made her feel bad) against her after the age of 15.
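To make the coding rule concrete, the following is a minimal sketch of how the binary lifetime physical-IPV outcome could be derived; the column names are hypothetical stand-ins for the DHS Domestic Violence Module items, and the same pattern would apply to the sexual and emotional outcomes.

```python
import pandas as pd

# Hypothetical item names standing in for the physical-violence acts listed above
# (pushed/shook/threw something; slapped, punched or kicked; strangled or burned;
# twisted arm or pulled hair; threatened with a knife or gun).
PHYSICAL_ITEMS = ["phys_push", "phys_slap_punch_kick", "phys_strangle_burn",
                  "phys_twist_pull", "phys_weapon"]

def any_reported(df, items):
    """Return 1 if the respondent reported any of the listed acts, 0 otherwise."""
    return df[items].eq(1).any(axis=1).astype(int)

# Usage (df is the merged respondent-level data frame):
# df["ipv_physical"] = any_reported(df, PHYSICAL_ITEMS)
```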

Key covariates

Eight key covariates measuring the characteristics of the interviewer were created using data imported from the fieldworker survey. In addition to the interviewer’s age, sex and marital status, four covariates measured differences between the interviewer and respondent. These variables were binary, coded 1 if the age difference between the respondent and interviewer was greater than five years in either direction, and if marital status, sex or home region of Zimbabwe were different between a respondent and her DHS interviewer. To incorporate a measure of ethnic identity, the language of the interview was also included and coded 1 if the interview was conducted in a language other than English. Zimbabwe has 16 official languages, with English serving most often as the second language and lingua franca (DeVere, 2017). Since more than 90 per cent of Zimbabweans speak either Shona or Ndebele as a native language (but few speak both), the ethnic identities of a respondent and interviewer can reasonably be assumed to be the same if the interview was conducted in a non-English language. The variables were chosen to capture differences between the respondent and interviewer that may shape the participants’ perceptions of interviewers as outsiders.
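As a sketch under the definitions above (column names are hypothetical; resp_* variables would come from the women’s survey and int_* variables from the merged fieldworker survey), the dyadic difference covariates could be constructed as follows.

```python
import pandas as pd

def add_dyadic_covariates(df):
    """Construct the binary interviewer-respondent difference covariates described above."""
    out = df.copy()
    # Age difference greater than five years in either direction.
    out["diff_age_gt5"] = (out["resp_age"] - out["int_age"]).abs().gt(5).astype(int)
    # Differences in marital status, sex and home region of Zimbabwe.
    out["diff_marital"] = (out["resp_marital"] != out["int_marital"]).astype(int)
    out["diff_sex"] = (out["resp_sex"] != out["int_sex"]).astype(int)
    out["diff_region"] = (out["resp_region"] != out["int_region"]).astype(int)
    # Language of the interview as a proxy for shared ethnic identity:
    # coded 1 if conducted in a language other than English.
    out["non_english"] = (out["interview_language"] != "english").astype(int)
    return out
```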

Control variables

Explanatory model development took a theoretical approach grounded in the existing literature on IPV in Southern Africa. Using an IPV-specific version of the social-ecological model (Heise, 1998), models included all eight key covariates, as well as individual-, household- and community-level variables as fixed (level one) effects. Specific covariates were chosen to reflect those previously used in studies of IPV in Southern Africa (Jewkes et al, 2003; 2010; Burgard and Lee-Rife, 2009; Decker et al, 2014). This approach situates the study within the existing literature while highlighting the potential for interviewer effects. The respondents’ level of education, age at first cohabitation, place of residence (rural versus urban), ideal number of children, employment status and reporting of controlling behaviour were included as individual characteristics. Furthermore, as used in previous DHS studies (Elfstrom and Stephenson, 2012; Metheny and Stephenson, 2017), a five-point additive scale of whether physical IPV is justified in any of five hypothetical scenarios (for example, ‘Do you think a man would be justified in beating his wife if she argues with him?’, ‘Do you think a man would be justified in beating his wife if she neglects their children?’) (Cronbach’s α = 0.75) and a four-point additive scale of decision-making autonomy (for example, ‘Who usually makes decisions about health care for yourself?’, ‘Who is the person who usually decides daily household purchases?’) (Cronbach’s α = 0.54) were also included (for full scale items, see Appendix 1). In measuring autonomy, original variables were collapsed into binary variables indicating whether the woman had any say in the decision before the scale was created. Wealth quintile and spousal differences in age, ideal number of children and education level between a respondent and her male partner were included as household characteristics. Following methods used in previous analyses of community-level effects using DHS data (Stephenson et al, 2007; Stephenson, 2009; Stephenson and Elfstrom, 2012; Metheny and Stephenson, 2017), community characteristics were proxied by aggregating individual-level responses to the level of the PSU. Means of IPV justification, controlling behaviour, age at marriage, decision-making autonomy, ideal number of children, household wealth, education level, female employment and dyadic differences in age, education and fertility preferences at the PSU level were included as measures of an individual’s community context. While univariate analyses did not inform model selection, a correlation matrix was used to identify possible issues of multicollinearity.
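To illustrate the scale construction and community-level aggregation described above, a short sketch follows; variable names are hypothetical and 'psu' stands for the Primary Sampling Unit identifier.

```python
import pandas as pd

# Hypothetical item names for the five IPV-justification scenarios listed in Appendix 1.
JUSTIFY_ITEMS = ["justify_goes_out", "justify_neglects_children", "justify_argues",
                 "justify_refuses_sex", "justify_burns_food"]

def build_context_variables(df):
    out = df.copy()
    # Five-point additive IPV-justification scale (0-5): count of 'yes' responses.
    out["ipv_justify_scale"] = out[JUSTIFY_ITEMS].eq(1).sum(axis=1)
    # Community characteristics proxied by PSU-level means of individual responses,
    # attached back to every woman living in the same PSU.
    for var in ["ipv_justify_scale", "age_at_marriage", "autonomy_scale", "ideal_children"]:
        out["psu_mean_" + var] = out.groupby("psu")[var].transform("mean")
    return out
```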

Analysis

Three two-level, multilevel logistic regression models were fit (one for each IPV outcome), using the interviewer identification code (v028) as the random intercept. In this data set, multiple respondents were assigned to each interviewer, creating a nested data structure. Multilevel modelling is required when analysing nested data to correct for the downward bias in standard errors caused by non-independent observations (Steele et al, 1996; Steele and Diamond, 1999; Diez-Roux, 2000). This approach also introduces a random error term into the regression equation, allowing the intercept to vary across interviewers and accounting for the effect of unmeasured or unmeasurable covariates (Steele and Diamond, 1999; Diez-Roux, 2000; Durrant et al, 2010). The models control for nesting with a random intercept term. Models also include community characteristics, with women who reside in the same community sharing the same characteristics. This cross-classified nesting structure is accounted for by including the PSU (community) as a fixed effect in the model at level one. Doing so accounts for the geographic nesting of women within communities while keeping the existing two-level modelling structure of women nested within interviewers (see Table 2).
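In notation, the model described here is a standard two-level random-intercept logistic regression; a sketch of its form (our notation, with respondents i nested in interviewers j) is:

```latex
\operatorname{logit}\Pr(y_{ij} = 1) \;=\; \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j,
\qquad u_j \sim N(0, \sigma_u^{2}),
```

where y_ij indicates whether woman i, interviewed by fieldworker j, reported the given form of IPV; x_ij collects the level-one fixed effects (the individual, household and community covariates, including the PSU term); and u_j is the interviewer-level random intercept, whose estimated variability is reported as sigma_mu in Table 2.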

Results

Overall, 30.1 per cent of women reported physical IPV, 11.6 per cent reported sexual IPV and 31 per cent reported emotional IPV. Interviewers were 28.8 years old on average, slightly younger than the average respondent age of 31.6 years. Only 31 respondents (0.6 per cent) were interviewed by men. Approximately one third (34.7 per cent) of interviewers had never been married or lived with a partner, and less than one fifth (19.4 per cent) had previously worked on a DHS. Nearly 58 per cent of interviewers were at least five years older or younger than the respondent they were interviewing, 79.2 per cent had a different marital status and 17.4 per cent came from a different region of Zimbabwe (see Table 1).

Table 1:

Sample characteristics of 5,522 women aged 15–49 and 120 fieldworkers from the 2015 Zimbabwe DHS

| Indicator | Respondent | Interviewer |
| --- | --- | --- |
| Age (µ, years) | 31.6 | 28.8 |
| Sex (%) | | |
| Male | 0 | 51.2 |
| Female | 100 | 48.8 |
| Marital status (%) | | |
| Married | 80.3 | 59.7 |
| Living with partner | 4.6 | 0 |
| Divorced/separated/widowed | 15.1 | 5.6 |
| Never married | 0 | 34.7 |
| Highest level of education (%) | | |
| No education/primary | 29.6 | 1.8 |
| Secondary | 63.0 | 17.1 |
| Higher | 7.4 | 81.1 |
| Place of residence (%) | | |
| Urban | 41.7 | 56.5 |
| Rural | 58.4 | 43.3 |
| Age at marriage (%) | | |
| ≤ 16 | 22.0 | – |
| 17–18 | 25.9 | – |
| 19+ | 52.1 | – |
| Wealth quintile (%) | | |
| Poorest | 17.9 | – |
| Poorer | 16.1 | – |
| Middle | 15.5 | – |
| Richer | 27.5 | – |
| Richest | 23.0 | – |
| Region of residence (%) | | |
| Manicaland | 11.8 | 12.5 |
| Mashonaland Central | 11.9 | 6.7 |
| Mashonaland East | 10.1 | 6.7 |
| Mashonaland West | 11.7 | 5.7 |
| Matebeleland North | 8.8 | 5.8 |
| Matebeleland South | 7.7 | 6.7 |
| Midlands | 9.4 | 12.5 |
| Masvingo | 11.2 | 16.7 |
| Harare | 10.1 | 19.2 |
| Bulawayo | 7.2 | 6.7 |
| Interviewer comes from different region than respondent (%) | 17.4 | – |
| IPV justification (µ, 0–5) | 0.8 | – |
| Ideal number of children (µ) | 4.1 | – |
| Employment status: employed (%) | 45.8 | 100 |
| Experienced controlling behaviour (%) | 66.1 | – |
| Decision-making autonomy (µ, 0–3) | 2.6 | – |
| Reported IPV (%) | | |
| Physical | 30.1 | – |
| Sexual | 11.6 | – |
| Emotional | 30.9 | – |
| Interview conducted in English (%) | 79.6 | 79.6 |
| Experience conducting a DHS (%) | – | 19.4 |

When controlling for individual-, household- and community-level effects, most interviewer characteristics were not significantly associated with the reporting of physical, sexual or emotional IPV (see Table 2). However, previous experience working on a DHS was associated with significantly lower odds of reporting physical IPV (aOR = 0.67). Results of each multilevel logistic regression model are presented in Table 2.

Table 2:

Associations between interviewer characteristics and reporting of physical, sexual and emotional IPV

| Indicator (referent) | Physical IPV aOR | p-value | 95% CI | Sexual IPV aOR | p-value | 95% CI | Emotional IPV aOR | p-value | 95% CI |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Individual variables | | | | | | | | | |
| Education | | | | | | | | | |
| Primary | 0.94 | 0.855 | 0.481–1.83 | 0.86 | 0.760 | 0.34–2.20 | 0.72 | 0.340 | 0.37–1.41 |
| Secondary | 0.86 | 0.685 | 0.428–1.74 | 0.78 | 0.626 | 0.29–2.09 | 0.70 | 0.323 | 0.35–1.41 |
| Higher | 0.47 | 0.071 | 0.210–1.07 | 0.53 | 0.274 | 0.17–1.65 | 0.44 | 0.044 | 0.20–0.98 |
| Rural | 0.89 | 0.546 | 0.598–1.31 | 1.14 | 0.624 | 0.67–1.95 | 0.93 | 0.701 | 0.63–1.37 |
| Age at marriage (< 17) | | | | | | | | | |
| 17–18 | 0.82 | 0.057 | 0.672–1.01 | 0.96 | 0.976 | 0.754–1.32 | 1.01 | 0.926 | 0.82–1.24 |
| 19+ | 0.63 | 0.000 | 0.524–0.78 | 0.87 | 0.341 | 0.66–1.15 | 0.84 | 0.091 | 0.69–1.03 |
| Wealth (poorest) | | | | | | | | | |
| Poorer | 0.94 | 0.642 | 0.740–1.20 | 1.01 | 0.959 | 0.72–1.40 | 0.89 | 0.225 | 0.67–1.10 |
| Middle | 0.78 | 0.082 | 0.594–1.03 | 0.73 | 0.121 | 0.49–1.09 | 0.76 | 0.054 | 0.58–1.00 |
| Richer | 0.56 | 0.002 | 0.382–0.81 | 0.89 | 0.661 | 0.54–1.49 | 0.68 | 0.042 | 0.47–0.99 |
| Richest | 0.42 | 0.000 | 0.268–0.66 | 0.80 | 0.481 | 0.43–1.49 | 0.63 | 0.039 | 0.40–0.98 |
| IPV justification | 1.13 | 0.000 | 1.06–1.19 | 1.11 | 0.006 | 1.03–1.20 | 1.06 | 0.049 | 1.00–0.12 |
| Ideal number of children | 0.37 | 0.149 | 0.92–1.01 | 0.93 | 0.023 | 0.87–0.99 | 0.99 | 0.704 | 0.95–1.04 |
| Employed (no) | 0.19 | 0.025 | 1.02–1.39 | 1.29 | 0.024 | 1.03–1.60 | 1.13 | 0.114 | 0.97–1.33 |
| Controlling behaviour | 3.31 | 0.000 | 2.77–3.97 | 4.64 | 0.000 | 0.36–6.40 | 5.15 | 0.000 | 4.25–6.24 |
| Decision-making autonomy | 0.88 | 0.016 | 0.80–0.98 | 0.89 | 0.098 | 0.77–1.02 | 0.89 | 0.012 | 0.79–0.97 |
| Dyadic difference in education (same education level) | | | | | | | | | |
| Wife higher education | 1.14 | 0.293 | 0.89–1.46 | 1.21 | 0.262 | 0.87–1.70 | 1.28 | 0.045 | 1.01–1.63 |
| Husband higher education | 0.83 | 0.094 | 0.67–1.03 | 0.81 | 0.164 | 0.56–1.09 | 0.86 | 0.155 | 0.69–1.06 |
| Dyadic difference in age (> five years) | 0.88 | 0.103 | 0.76–1.03 | 1.10 | 0.411 | 0.88–1.35 | 0.98 | 0.841 | 0.85–1.14 |
| Dyadic difference in fertility preferences | 0.69 | 0.000 | 0.59–0.80 | 0.53 | 0.000 | 0.42–0.66 | 0.62 | 0.000 | 0.54–0.73 |
| Community variables | | | | | | | | | |
| Mean IPV justification | 0.92 | 0.513 | 0.73–1.17 | 1.06 | 0.736 | 0.76–1.48 | 0.98 | 0.826 | 0.76–1.24 |
| Proportion of husbands with controlling behaviour | 0.94 | 0.827 | 0.55–1.62 | 1.60 | 0.234 | 0.74–3.49 | 0.62 | 0.730 | 0.52–1.58 |
| Mean age at marriage | 0.90 | 0.008 | 0.83–0.97 | 1.04 | 0.517 | 0.93–1.15 | 1.01 | 0.762 | 0.94–1.09 |
| Mean decision-making autonomy | 0.80 | 0.207 | 0.57–1.13 | 0.86 | 0.521 | 0.53–1.38 | 0.74 | 0.087 | 0.52–1.04 |
| Mean ideal number of children | 1.04 | 0.670 | 0.88–1.21 | 1.38 | 0.005 | 1.10–1.73 | 1.06 | 0.519 | 0.90–1.24 |
| Mean wealth quintile | 1.23 | 0.064 | 0.99–1.53 | 1.11 | 0.505 | 0.82–1.51 | 1.24 | 0.058 | 0.99–1.54 |
| Mean level of female education | 1.02 | 0.728 | 0.92–1.13 | 1.15 | 0.055 | 1.00–1.33 | 0.92 | 0.117 | 0.83–1.02 |
| Proportion of women employed | 2.01 | 0.027 | 1.08–3.73 | 0.52 | 0.149 | 0.22–1.26 | 0.77 | 0.412 | 0.41–1.44 |
| Proportion of couples with different education levels | 0.38 | 0.055 | 0.99–1.90 | 1.74 | 0.019 | 1.10–2.77 | 0.91 | 0.570 | 0.66–1.26 |
| Proportion of couples with different fertility preferences | 1.01 | 0.967 | 0.99–1.74 | 1.64 | 0.213 | 0.75–3.56 | 1.26 | 0.410 | 0.73–2.17 |
| Proportion of couples with an age difference of > five years | 0.37 | 0.001 | 0.21–0.66 | 0.55 | 0.141 | 0.25–1.22 | 0.83 | 0.524 | 0.47–1.46 |
| PSU | 1.00 | 0.829 | 0.99–1.00 | 1.00 | 0.678 | 0.99–1.00 | 1.00 | 0.701 | 0.63–1.37 |
| Interviewer characteristics | | | | | | | | | |
| Interviewer sex (male) | 1.90 | 0.214 | 0.69–5.24 | 1.54 | 0.584 | 0.33–7.12 | 1.26 | 0.676 | 0.42–3.79 |
| Interviewer experience collecting a DHS (no) | 0.67 | 0.015 | 0.49–0.93 | 1.08 | 0.662 | 0.77–1.52 | 0.86 | 0.427 | 0.58–1.26 |
| Interviewer marital status (never married) | 0.79 | 0.330 | 0.49–1.27 | 0.58 | 0.103 | 0.30–1.12 | 0.77 | 0.344 | 0.44–1.33 |
| Interviewer age | 1.01 | 0.803 | 0.96–1.05 | 0.99 | 0.775 | 0.94–1.04 | 1.01 | 0.697 | 0.96–1.07 |
| Language of interview (English) | 0.94 | 0.654 | 0.730–1.22 | 0.748 | 0.108 | 0.52–1.07 | 0.94 | 0.652 | 0.72–1.23 |
| Interviewer–respondent difference in marital status (no) | 0.93 | 0.747 | 0.59–1.46 | 0.64 | 0.182 | 0.33–1.23 | 0.94 | 0.824 | 0.57–1.56 |
| Interviewer–respondent difference in region (no) | 0.88 | 0.297 | 0.70–1.12 | 0.96 | 0.815 | 0.69–1.34 | 1.03 | 0.817 | 0.81–1.31 |
| Interviewer–respondent difference in age of > five years (no) | 0.95 | 0.196 | 0.88–1.03 | 0.97 | 0.609 | 0.87–1.09 | 0.96 | 0.282 | 0.88–1.03 |
| Random effect | | | | | | | | | |
| Sigma_mu | 0.28 | | | 0.21 | | | 0.40 | | |
| Standard error | 0.06 | | | 0.01 | | | 0.05 | | |

Note: CI = confidence interval.

Discussion

While the lack of significant associations between most interviewer characteristics and respondent reporting of IPV in this analysis is contrary to much of the evidence on interviewer error, one potential reason may be the especially rigorous nature of the DHS interviewer training process. The calibre of this training may serve to attenuate interviewer effects on the reporting of IPV in the short term, perhaps by increasing fidelity to the study protocol beyond that achieved by other interviewer trainings. This would be consistent with evidence from the World Health Organization (WHO) multi-country study of violence against women. In Serbia and Montenegro, pressure to finish fieldwork quickly necessitated hiring previously trained contract interviewers in addition to those trained by the WHO, which uses a training process similar to that of the DHS (Jansen et al, 2004). WHO-trained interviewers elicited significantly higher reporting of physical violence and sexual violence, and higher respondent satisfaction, than the contract interviewers, whose training varied more in rigour (Jansen et al, 2004).

The only significant interviewer effect identified was the lower odds of reporting physical IPV among respondents interviewed by experienced DHS interviewers. Similar associations between greater interviewer experience, reduced interview quality and the reduced reporting of other types of sensitive data have been found in previous studies (Singer et al, 1983; Gfroerer et al, 2002; Hughes et al, 2002; Chromy et al, 2003; Olson and Peytchev, 2007; Olson and Bilgen, 2011; Park et al, 2014). While the limitations of cross-sectional data mean that the directionality and mechanisms by which interviewer experience may be associated with the reporting of physical IPV remain unknown, there are three potential explanations for this relationship.

First, the relationship between previous experience and the reporting of sensitive data may stem from differences in how people complete tasks as novices and how they complete them after gaining considerable experience. As people become more experienced at performing a particular task, the cognitive schemas used to perform it are moved from the more limited working (short-term) memory to the more expansive long-term memory to reduce working memory load (Kalyuga et al, 2003; Shepherd et al, 2003; Sweller et al, 2011). Consequently, the automatic processing used to access long-term memory is associated with increases in various cognitive errors that can result in reduced task quality (Shepherd et al, 2003). Olson and colleagues (2007, 2011) found that experienced interviewers increase the speed at which they conduct interviews and are rated by respondents as less invested in the interview than novice interviewers. This may then decrease the ability to accurately collect sensitive data. Similar findings were seen in successive waves of the National Survey on Drug Use and Health (NSDUH) in the US, in which reported rates of illicit drug use were significantly lower among respondents who were interviewed by fieldworkers with previous experience administering the NSDUH (Gfroerer et al, 2002; Hughes et al, 2002; Chromy et al, 2003; Park et al, 2014). Interestingly, this relationship was not significant among interviewers who had previously served as fieldworkers for other surveys but were naive to the NSDUH (Park et al, 2014). While the rigorous interviewer training provided to DHS interviewers may serve to reduce interviewer effects in the short term, this effect may diminish over time, even as interviewers receive refresher training for each round of data collection. That is, as interviewers gain experience with a survey instrument and the task of interviewing becomes more rote, the amelioration of interviewer effects due to DHS interviewer training may wane. Along these lines, interviewers with more experience may be better than novice interviewers at recognising the non-physical manifestations of IPV and focus on these questions to the detriment of the physical IPV questions. While this theory cannot be tested using the available data, more in-depth research with DHS interviewers may shed light on this possibility.

Second, it is possible that experienced interviewers elicit an accurate reporting of physical IPV, meaning that the association seen in this analysis represents an over-reporting of physical IPV by novice interviewers. While the over-reporting of IPV is thought to be rare (Ellsberg et al, 2001), novice interviewers are often given fewer assignments (Park et al, 2014) and tend to spend more time with each respondent (Olson and Peytchev, 2007) than experienced interviewers. These two patterns may allow more time for novice interviewers to develop a higher level of rapport with respondents than do experienced interviewers. While ostensibly a net positive in data collection, rapport can lead to a phenomenon known as respondent acquiescence, or ‘yea-saying’. Acquiescence occurs when respondents reply in the affirmative to sensitive survey questions in order to maintain a high level of rapport with their interviewer (Olson and Bilgen, 2011). More experience collecting DHS data may allow interviewers to strike a balance that allows for accurate data collection without acquiescence and the subsequent potential for over-reporting.

Lastly, the observed association between interviewer experience and physical IPV may be due to unobserved heterogeneity and not be related to interviewer experience at all. Researchers analysing the NSDUH posited that the significantly higher rates of marijuana and cocaine use reported to novice interviewers could be partially due to sampling bias, wherein novice interviewers were more likely to be assigned to lower-income, urban areas that traditionally had higher rates of illicit drug use than the higher-income, suburban or rural areas canvassed by experienced interviewers (Park et al, 2014). A similar pattern is possible for the DHS and the reporting of IPV. For example, novice interviewers in the 2015 Zimbabwe DHS were significantly more likely than experienced interviewers to interview rural (χ2 = 27.64, p < 0.001) and less educated (χ2 = 10.01, p = 0.001) respondents – two groups with a traditionally higher prevalence of IPV. This analysis controlled for these two variables; however, the same pattern is possible for other indicators that are also associated with IPV but not controlled for in the modelling. Therefore, if novice or experienced fieldworkers were more likely to interview respondents with characteristics that are associated with IPV but not controlled for in the models (such as the male partner’s drug/alcohol use or childhood trauma, or the respondent’s contravention of her community’s fertility norms), any significant associations could be incorrectly attributed to differences in interviewer experience. Additional demographic data on all DHS fieldworkers, as well as an ability to track experienced interviewers over successive survey phases, would allow for more robust analyses of interviewer error.

Future use of fieldworker data sets

The DHS is set to include the fieldworker data set in all future DHS surveys (Kishor et al, 2017). Fieldworker recruitment, training and survey implementation likely vary by country, warranting additional studies of how interviewer effects might shape respondents’ reporting of IPV across contexts. The advent of the fieldworker data set also gives researchers the opportunity to improve the robustness of analyses. Multilevel modelling accounts for the hierarchical nature of DHS data via the nesting of respondents within interviewers (Diez-Roux, 2000; Steele and Curtis, 2003; Durrant and Steele, 2009; Durrant et al, 2010; Stephenson et al, 2013). This approach corrects the downward bias in standard errors present when using ordinary least squares regression with non-independent observations and introduces an error term that accounts for unmeasured or unmeasurable interviewer effects (Amin et al, 2002; Luke, 2005; Clarke et al, 2015). Future DHS analyses should therefore consider using the fieldworker code as a random effect in countries where the fieldworker data set is available in order to control for any interviewer effects present in the data and provide more efficient estimates of key covariates.
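As a sketch of what this could look like in practice, assuming a Python workflow (the choice of statsmodels’ Bayesian mixed GLM here is ours, not something specified by the DHS or this paper), the fieldworker code enters as the grouping term of a random-intercept logistic model; synthetic data stand in for the DHS files:

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Synthetic stand-in data: 40 interviewers with 50 respondents each, a binary outcome
# and two illustrative covariates. A real analysis would use the merged DHS
# respondent/fieldworker file, with v028 as the interviewer code.
rng = np.random.default_rng(1)
n_int, n_per = 40, 50
interviewer = np.repeat(np.arange(n_int), n_per)
experience = np.repeat(rng.integers(0, 2, n_int), n_per)   # interviewer-level covariate
age = rng.integers(15, 50, n_int * n_per)                  # respondent-level covariate
u = np.repeat(rng.normal(0, 0.3, n_int), n_per)            # interviewer random effect
p = 1 / (1 + np.exp(-(-0.8 - 0.4 * experience + 0.01 * age + u)))
df = pd.DataFrame({"ipv_physical": rng.binomial(1, p), "experience": experience,
                   "age": age, "interviewer": interviewer})

# Random-intercept logistic regression with the interviewer code as the
# variance-component (grouping) term.
model = BinomialBayesMixedGLM.from_formula(
    "ipv_physical ~ experience + age",
    vc_formulas={"interviewer": "0 + C(interviewer)"},
    data=df,
)
result = model.fit_vb()   # variational Bayes fit; fit_map() is an alternative
print(result.summary())
```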

Limitations

There are three main limitations to this study. First, the cross-sectional nature of DHS data precludes inferences of causality. Second, while half of the interviewers in the data set were male, only 31 women were asked IPV questions by a male interviewer. This may reflect an effort on the part of the DHS, or its subcontracted agency in Zimbabwe, to gender-match respondents and interviewers for this module. However, this limits the ability to detect an effect of interviewer gender in the data. Third, the language of the interview was numerically coded in the two data sets using different numbering systems and including different numbers of languages, precluding an exact match between the native tongue of the interviewer and the language of the interview. While the ethnic landscape of Zimbabwe allows a binary English–non-English variable to serve as an approximation, the ability to differentiate non-English languages (that is, Shona versus Ndebele) would give a better understanding of the ethnic and linguistic differences between the interviewer and respondent.

Conclusion

This is the first study to use the newly available DHS fieldworker data set to understand associations between interviewer characteristics and the reporting of IPV in Zimbabwe. Variables that measure the attitudes or knowledge of interviewers (that is, the justification of violence, attitudes towards contraception, HIV knowledge and so on) and more detailed information on the degree of previous DHS experience may help disentangle how interviewer effects and differences may be associated with the reporting of IPV. While interviewer characteristics are only one component in a complex, multilevel rationale for why IPV is thought to be under-reported, understanding how DHS fieldworkers may shape the reporting of IPV is a step towards accurately measuring its burden in LMICs.

Conflicts of interest

The authors declare that there is no conflict of interest.

References

  • Amin, S., Basu, A.M. and Stephenson, R. (2002) Spatial variation in contraceptive use in Bangladesh: looking beyond the borders, Demography, 39(4): 251–67. doi: 10.1353/dem.2002.0014

  • Burgard, S.A. and Lee-Rife, S.M. (2009) Community characteristics, sexual initiation, and condom use among young black South Africans, Journal of Health and Social Behavior, 50(3): 293–309. doi: 10.1177/002214650905000304

  • Campbell, J., Jones, A.S., Dienemann, J., Kub, J., Schollenberger, J., O’Campo, P. and Wynne, C. (2002) Intimate partner violence and physical health consequences, Archives of Internal Medicine, 162(10): 1157–63. doi: 10.1001/archinte.162.10.1157

  • Campbell, R., Dworkin, E. and Cabral, G. (2009) An ecological model of the impact of sexual assault on women’s mental health, Trauma, Violence, and Abuse, 10(3): 225–46. doi: 10.1177/1524838009334456

  • Chard, A.N., Finneran, C., Sullivan, P.S. and Stephenson, R. (2015) Experiences of homophobia among gay and bisexual men: results from a cross-sectional study in seven countries, Culture, Health & Sexuality, 17(10): 1174–89. doi: 10.1080/13691058.2015.1042917

  • Chromy, J., Odom, D., Eyerman, J. and McNeeley, M. (2003) The effect of interviewer experience on the interview process in the National Survey on Drug Use and Health, in JSM Proceedings, Survey Research Methods Section, Alexandria, VA: American Statistical Association, pp 954–61.

  • Clarke, P., Sproston, K. and Thomas, R. (2003) An investigation into expectation-led interviewer effects in health surveys, Social Science & Medicine, 56(10): 2221–8.

  • Davis, D.W. (1997) Nonrandom measurement error and race of interviewer effects among African Americans, The Public Opinion Quarterly, 61(1): 183–207. doi: 10.1086/297792

  • Davis, R.E., Couper, M.P., Janz, N.K., Caldwell, C.H. and Resnicow, K. (2009) Interviewer effects in public health surveys, Health Education Research, 25(1): 14–26. doi: 10.1093/her/cyp046

  • Decker, M.R., Peitzmeier, S., Olumide, A., Acharya, R., Ojengbede, O., Covarrubias, L. and Brahmbhatt, H. (2014) Prevalence and health impact of intimate partner violence and non-partner sexual violence among female adolescents aged 15–19 years in vulnerable urban environments: a multi-country study, Journal of Adolescent Health, 55(6, Supplement): S58–S67.

  • DeVere (2017) Languages of Zimbabwe, www.devere-zimbabwe.co.zw/news/zimbabwe-official-languages

  • Diez-Roux, A.V. (2000) Multilevel analysis in public health research, Annual Review of Public Health, 21: 171–92.

  • Durrant, G.B. and Steele, F. (2009) Multilevel modelling of refusal and non-contact in household surveys: evidence from six UK government surveys, Journal of the Royal Statistical Society, Series A: Statistics in Society, 172(2): 361–81. doi: 10.1111/j.1467-985X.2008.00565.x

  • Durrant, G.B., Groves, R.M., Staetsky, L. and Steele, F. (2010) Effects of interviewer attitudes and behaviors on refusal in household surveys, Public Opinion Quarterly, 74(1): 1–36. doi: 10.1093/poq/nfp098

  • Edwards, A.L. (1953) The relationship between the judged desirability of a trait and the probability that the trait will be endorsed, Journal of Applied Psychology, 37(2): 90. doi: 10.1037/h0058073

  • Elfstrom, K.M. and Stephenson, R. (2012) The role of place in shaping contraceptive use among women in Africa, PLoS ONE, 7(7). doi: 10.1371/journal.pone.0040670

  • Ellsberg, M., Heise, L., Peña, R., Agurto, S. and Winkvist, A. (2001) Researching domestic violence against women: methodological and ethical considerations, Studies in Family Planning, 32(1): 1–16. doi: 10.1111/j.1728-4465.2001.00001.x

  • Garcia-Moreno, C., Jansen, H.A., Ellsberg, M., Heise, L. and Watts, C.H. (2006) Prevalence of intimate partner violence: findings from the WHO multi-country study on women’s health and domestic violence, Lancet, 368(9543): 1260–9. doi: 10.1016/S0140-6736(06)69523-8

  • Gfroerer, J.C., Eyerman, J. and Chromy, J.R. (2002) Redesigning an Ongoing National Household Survey: Methodological Issues, Washington, DC: Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Office of Applied Studies.

  • Groves, R.M. (2004) Survey Errors and Survey Costs, Hoboken, NJ: John Wiley & Sons.

  • Heise, L.L. (1998) Violence against women: an integrated, ecological framework, Violence Against Women, 4(3): 262–90. doi: 10.1177/1077801298004003002

  • Hines, D.A., Douglas, E.M. and Mahmood, S. (2010) The effects of survey administration on disclosure rates to sensitive items among men: a comparison of an internet panel sample with a RDD telephone sample, Computers in Human Behavior, 26(6): 1327–35. doi: 10.1016/j.chb.2010.04.006

  • Hughes, A., Chromy, J., Giacoletti, K. and Odom, D. (2002) Impact of interviewer experience on respondent reports of substance use, in J.C. Gfroerer, J. Eyerman and J.R. Chromy (eds) Redesigning an Ongoing National Household Survey: Methodological Issues, Washington, DC: Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Office of Applied Studies, pp 161–84.

  • ICF International (2017a) Demographic and Health Surveys Interviewer’s Manual, Rockville, MD: ICF.

  • ICF International (2017b) Who we are, https://dhsprogram.com/who-we-are/About-Us.cfm

  • Jansen, H.A., Watts, C., Ellsberg, M., Heise, L. and Garcia-Moreno, C. (2004) Interviewer training in the WHO multi-country study on women’s health and domestic violence, Violence Against Women, 10(7): 831–49. doi: 10.1177/1077801204265554

  • Jewkes, R., Levin, J. and Penn-Kekana, L. (2002) Risk factors for domestic violence: findings from a South African cross-sectional study, Social Science & Medicine, 55(9): 1603–17, http://dx.doi.org/10.1016/S0277-9536(01)00294-5

  • Jewkes, R.K., Levin, J.B. and Penn-Kekana, L.A. (2003) Gender inequalities, intimate partner violence and HIV preventive practices: findings of a South African cross-sectional study, Social Science & Medicine, 56(1): 125–34.

  • Kalyuga, S., Ayres, P., Chandler, P. and Sweller, J. (2003) The expertise reversal effect, Educational Psychologist, 38(1): 23–31. doi: 10.1207/S15326985EP3801_4

  • Kishor, S., Elkasabi, M. and Nybro, E. (2017) A new DHS questionnaire: interviewing fieldworkers, https://blog.dhsprogram.com/fieldworkers/

  • Krysan, M. and Couper, M.P. (2003) Race in the live and the virtual interview: racial deference, social desirability, and activation effects in attitude surveys, Social Psychology Quarterly, 66(4): 364–83. doi: 10.2307/1519835

  • Luke, D.A. (2005) Getting the big picture in community science: methods that capture context, American Journal of Community Psychology, 35(3/4): 185–200. doi: 10.1007/s10464-005-3397-z

  • McGlone, M.S., Aronson, J. and Kobrynowicz, D. (2006) Stereotype threat and the gender gap in political knowledge, Psychology of Women Quarterly, 30(4): 392–8. doi: 10.1111/j.1471-6402.2006.00314.x

  • Metheny, N. and Stephenson, R. (2017) How the community shapes unmet need for modern contraception: an analysis of 44 demographic and health surveys, Studies in Family Planning, 48(3): 235–51. doi: 10.1111/sifp.12028

  • Milligan, L. (2016) Insider-outsider-inbetweener? Researcher positioning, participative methods and cross-cultural educational research, Compare: A Journal of Comparative and International Education, 46(2): 235–50. doi: 10.1080/03057925.2014.928510

  • Olson, K. and Bilgen, I. (2011) The role of interviewer experience on acquiescence, Public Opinion Quarterly, 75(1): 99–114. doi: 10.1093/poq/nfq067

  • Olson, K. and Peytchev, A. (2007) Effect of interviewer experience on interview pace and interviewer attitudes, Public Opinion Quarterly, 71(2): 273–86. doi: 10.1093/poq/nfm007

  • Palermo, T., Bleck, J. and Peterman, A. (2014) Tip of the iceberg: reporting and gender-based violence in developing countries, American Journal of Epidemiology, 179(5): 602–12. doi: 10.1093/aje/kwt295

  • Park, H., Currivan, D., Wang, K., Heddon, S., Highes, A. and Painter, D. (2014) National Survey on Drug Use and Health: Summary of Methodological Studies, 1971–2014, No. 0212800.001.208.007.034, Rockville, MD: Substance Use and Mental Health Services Administration.

  • Paulhus, D.L. (2001) Socially desirable responding: the evolution of a construct, in H.I. Braun, D.N. Jackson and D.E. Wiley (eds) The Role of Constructs in Psychological and Educational Measurement, 1st edn, Mahwah, New Jersey: Routledge, pp 49–69.

  • Perinelli, E. and Gremigni, P. (2016) Use of social desirability scales in clinical psychology: a systematic review, Journal of Clinical Psychology, 72(6): 534–51. doi: 10.1002/jclp.22284

  • Schuman, H. and Converse, J.M. (1971) The effects of black and white interviewers on black responses in 1968, Public Opinion Quarterly, 35(1): 44–68. doi: 10.1086/267866

  • Shepherd, D.A., Zacharakis, A. and Baron, R.A. (2003) VCs’ decision processes: evidence suggesting more experience may not always be better, Journal of Business Venturing, 18(3): 381–401. doi: 10.1016/S0883-9026(02)00099-X

  • Singer, E., Frankel, M.R. and Glassman, M.B. (1983) The effect of interviewer characteristics and expectations on response, Public Opinion Quarterly, 47(1): 68–83. doi: 10.1086/268767

  • Steele, F. and Curtis, S.L. (2003) Appropriate methods for analyzing the effect of method choice on contraceptive discontinuation, Demography, 40(1): 1–22. doi: 10.1353/dem.2003.0009

  • Steele, F. and Diamond, I. (1999) Contraceptive switching in Bangladesh, Studies in Family Planning, 30(4): 315–28. doi: 10.1111/j.1728-4465.1999.t01-3-.x

  • Steele, F., Diamond, I. and Amin, S. (1996) Immunization uptake in rural Bangladesh: a multilevel analysis, Journal of the Royal Statistical Society, Series A: Statistics in Society, 159(2): 289–99. doi: 10.2307/2983175

  • Stephenson, R. (2009) Community factors shaping HIV-related stigma among young people in three African countries, AIDS Care, 21(4): 403–10. doi: 10.1080/09540120802290365

  • Stephenson, R. and Elfstrom, K.M. (2012) Community influences on antenatal and delivery care in Bangladesh, Egypt, and Rwanda, Public Health Reports, 127(1): 96–106. doi: 10.1177/003335491212700111

  • Stephenson, R., Baschieri, A., Clements, S., Hennink, M. and Madise, N. (2007) Contextual influences on modern contraceptive use in sub-Saharan Africa, American Journal of Public Health, 97(7): 1233–40. doi: 10.2105/AJPH.2005.071522

  • Stephenson, R., Koenig, M.A., Acharya, R. and Roy, T.K. (2008) Domestic violence, contraceptive use, and unwanted pregnancy in rural India, Studies in Family Planning, 39(3): 177–86. doi: 10.1111/j.1728-4465.2008.165.x

  • Stephenson, R., Elfstrom, K.M. and Winter, A. (2013) Community influences on married men’s uptake of HIV testing in eight African countries, AIDS and Behavior, 17(7): 2352–66. doi: 10.1007/s10461-012-0223-0

  • Sweller, J., Ayres, P. and Kalyuga, S. (2011) The expertise reversal effect, in Cognitive Load Theory, New York: Springer, pp 155–70.

  • Tufts University School of Medicine (no date) Computer-assisted self-interview (CASI), http://acasi.tufts.edu/casi.htm

  • Ventimiglia, M. and MacDonald, D.A. (2012) An examination of the factorial dimensionality of the Marlowe–Crowne social desirability scale, Personality and Individual Differences, 52(4): 487–91. doi: 10.1016/j.paid.2011.11.016

  • Visschers, J., Jaspaert, E. and Vervaeke, G. (2017) Social desirability in intimate partner violence and relationship satisfaction reports: an exploratory analysis, Journal of Interpersonal Violence, 32(9): 1401–20. doi: 10.1177/0886260515588922

  • Weinreb, A.A. (2006) The limitations of stranger-interviewers in rural Kenya, American Sociological Review, 71(6): 1014–39. doi: 10.1177/000312240607100607

  • World Health Organization (2013) Global and Regional Estimates of Violence Against Women: Prevalence and Health Effects of Intimate Partner and Non-partner Sexual Violence, Geneva: World Health Organization.

  • World Health Organization (2017) Violence against women, www.who.int/mediacentre/factsheets/fs239/en/

Appendix 1: Scale items for IPV justification and decision-making autonomy

IPV justification

Stem: In your opinion, is a husband justified in hitting or beating his wife in the following situations:

  1. If she goes out without telling him?

  2. If she neglects the children?

  3. If she argues with him?

  4. If she refuses to have sex with him?

  5. If she burns the food?

Response options:

  • Yes

  • No

  • I don’t know

Decision-making autonomy

  1. Who usually makes decisions about health care for yourself: you, your (husband/partner), you and your (husband/partner) jointly, or someone else?

  2. Who usually makes decisions about making major household purchases?

  3. Who usually makes decisions about visits to your family or relatives?

Response options:

  • Respondent

  • Husband/partner

  • Respondent and husband/partner jointly

  • Someone else

  • Other

  • Amin, S., Basu, A.M. and Stephenson, R. (2002) Spatial variation in contraceptive use in Bangladesh: looking beyond the borders, Demography, 39(4): 25167. doi: 10.1353/dem.2002.0014

    • Search Google Scholar
    • Export Citation
  • Burgard, S.A. and Lee-Rife, S.M. (2009) Community characteristics, sexual initiation, and condom use among young black South Africans. Journal of Health and Social Behavior, 50(3): 293309. doi: 10.1177/002214650905000304

    • Search Google Scholar
    • Export Citation
  • Campbell, J., Jones, A.S., Dienemann, J., Kub, J., Schollenberger, J., O’Campo, P. and Wynne, C. (2002) Intimate partner violence and physical health consequences, Archives of Internal Medicine, 162(10): 115763. doi: 10.1001/archinte.162.10.1157

    • Search Google Scholar
    • Export Citation
  • Campbell, R., Dworkin, E. and Cabral, G. (2009) An ecological model of the impact of sexual assault on women’s mental health, Trauma, Violence, and Abuse, 10(3): 22546. doi: 10.1177/1524838009334456

    • Search Google Scholar
    • Export Citation
  • Chard, A.N., Finneran, C., Sullivan, P. S. and Stephenson, R. (2015) Experiences of homophobia among gay and bisexual men: Results from a cross-sectional study in seven countries, Culture, Health & Sexuality, 17(10): 117489. doi: 10.1080/13691058.2015.1042917

    • Search Google Scholar
    • Export Citation
  • Chromy, J., Odom, D., Eyerman, J. and McNeeley, M. (2003) The effect of interviewer experience on the interview process in the national survey on drug use and health, In JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association. (954–961).

    • Search Google Scholar
    • Export Citation
  • Clarke, P., Sproston, K. and Thomas, R. (2003) An investigation into expectation-led interviewer effects in health surveys, Social Science & Medicine, 56(10): 22218.

    • Search Google Scholar
    • Export Citation
  • Davis, D.W. (1997) Nonrandom measurement error and race of interviewer effects among African Americans, The Public Opinion Quarterly, 61(1): 183207. doi: 10.1086/297792

    • Search Google Scholar
    • Export Citation
  • Davis, R.E., Couper, M.P., Janz, N.K., Caldwell, C.H. and Resnicow, K. (2009) Interviewer effects in public health surveys, Health Education Research, 25(1): 1426. doi: 10.1093/her/cyp046

    • Search Google Scholar
    • Export Citation
  • Decker, M.R., Peitzmeier, S., Olumide, A., Acharya, R., Ojengbede, O., Covarrubias, L. and Brahmbhatt, H. (2014) Prevalence and health impact of intimate partner violence and non-partner sexual violence among female adolescents aged 15–19 years in vulnerable urban environments: a multi-country study, Journal of Adolescent Health, 55(6, Supplement): S58S67.

    • Search Google Scholar
    • Export Citation
  • DeVere (2017) Languages of Zimbabwe, www.devere-zimbabwe.co.zw/news/zimbabwe-official-languages

  • Diez-Roux, A.V. (2000) Multilevel analysis in public health research, Annual Review of Public Health, 21: 17192. doi:

  • Durrant, G.B. and Steele, F. (2009) Multilevel modelling of refusal and non-contact in household surveys: evidence from six UK government surveys, Journal of the Royal Statistical Society. Series A: Statistics in Society, 172(2): 36181. doi: 10.1111/j.1467-985X.2008.00565.x

    • Search Google Scholar
    • Export Citation
  • Durrant, G.B., Groves, R.M., Staetsky, L. and Steele, F. (2010) Effects of interviewer attitudes and behaviors on refusal in household surveys, Public Opinion Quarterly, 74(1): 136. doi: 10.1093/poq/nfp098

    • Search Google Scholar
    • Export Citation
  • Edwards, A. L. (1953) The relationship between the judged desirability of a trait and the probability that the trait will be endorsed, Journal of Applied Psychology, 37(2): 90. doi: 10.1037/h0058073

    • Search Google Scholar
    • Export Citation
  • Elfstrom, K.M. and Stephenson, R. (2012) The role of place in shaping contraceptive use among women in Africa, PLoS ONE, 7(7). doi: 10.1371/journal.pone.0040670

    • Search Google Scholar
    • Export Citation
  • Ellsberg, M., Heise, L., Peña, R., Agurto, S. and Winkvist, A. (2001) Researching domestic violence against women: methodological and ethical considerations, Studies in Family Planning, 32(1): 116. doi: 10.1111/j.1728-4465.2001.00001.x

    • Search Google Scholar
    • Export Citation
  • Garcia-Moreno, C., Jansen, H.A., Ellsberg, M., Heise, L. and Watts, C.H. (2006) Prevalence of intimate partner violence: findings from the WHO multi-country study on women’s health and domestic violence, Lancet, 368(9543): 12609. doi: 10.1016/S0140-6736(06)69523-8

    • Search Google Scholar
    • Export Citation
  • Gfroerer, J.C., Eyerman, J. and Chromy, J.R. (2002) Redesigning an ongoing national household survey: Methodological issues, Washington, DC: Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Office of Applied Studies.

    • Search Google Scholar
    • Export Citation
  • Groves, R.M. (2004) Survey errors and survey costs, Hoboken, NJ: John Wiley & Sons.

  • Heise, L.L. (1998) Violence against women: an integrated, ecological framework. Violence Against Women, 4(3): 26290. doi: 10.1177/1077801298004003002

    • Search Google Scholar
    • Export Citation
  • Hines, D.A., Douglas, E.M. and Mahmood, S. (2010) The effects of survey administration on disclosure rates to sensitive items among men: A comparison of an internet panel sample with a RDD telephone sample, Computers in Human Behavior, 26(6): 132735. doi: 10.1016/j.chb.2010.04.006

    • Search Google Scholar
    • Export Citation
  • Hughes, A., Chromy, J., Giacoletti, K. and Odom, D. (2002) Impact of interviewer experience on respondent reports of substance use, in J.C. Gfroerer, J. Eyerman and J.R. Chromy (eds) Redesigning an Ongoing National Household Survey: Methodological Issues, Washington, DC: Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Office of Applied Studies, pp 16184.

    • Search Google Scholar
    • Export Citation
  • ICF International (2017a) Demographic and Health Surveys Interviewer’s Manual, Rockville, MD: ICF.

  • ICF International (2017b) Who we are, https://dhsprogram.com/who-we-are/About-Us.cfm

  • Jansen, H.A., Watts, C., Ellsberg, M., Heise, L. and Garcia-Moreno, C. (2004) Interviewer training in the WHO multi-country study on women’s health and domestic violence, Violence Against Women, 10(7): 83149. doi: 10.1177/1077801204265554

    • Search Google Scholar
    • Export Citation
  • Jewkes, R., Levin, J. and Penn-Kekana, L. (2002) Risk factors for domestic violence: findings from a South African cross-sectional study, Social Science & Medicine, 55(9): 160317, http://dx.doi.org/10.1016/S0277-9536(01)00294-5

    • Search Google Scholar
    • Export Citation
  • Jewkes, R.K., Levin, J.B. and Penn-Kekana, L.A. (2003) Gender inequalities, intimate partner violence and HIV preventive practices: findings of a South African cross-sectional study, Social Science & Medicine, 56(1): 12534.

    • Search Google Scholar
    • Export Citation
  • Kalyuga, S., Ayres, P., Chandler, P. and Sweller, J. (2003) The expertise reversal effect, Educational Psychologist, 38(1): 2331. doi: 10.1207/S15326985EP3801_4

  • Kishor, S., Elkasabi, M. and Nybro, E. (2017) A new DHS questionnaire: interviewing fieldworkers, https://blog.dhsprogram.com/fieldworkers/

  • Krysan, M. and Couper, M.P. (2003) Race in the live and the virtual interview: racial deference, social desirability, and activation effects in attitude surveys, Social Psychology Quarterly, 66(4): 364–83. doi: 10.2307/1519835

  • Luke, D.A. (2005) Getting the big picture in community science: methods that capture context, American Journal of Community Psychology, 35(3/4): 185–200. doi: 10.1007/s10464-005-3397-z

  • McGlone, M.S., Aronson, J. and Kobrynowicz, D. (2006) Stereotype threat and the gender gap in political knowledge, Psychology of Women Quarterly, 30(4): 392–8. doi: 10.1111/j.1471-6402.2006.00314.x

  • Metheny, N. and Stephenson, R. (2017) How the community shapes unmet need for modern contraception: an analysis of 44 demographic and health surveys, Studies in Family Planning, 48(3): 235–51. doi: 10.1111/sifp.12028

  • Milligan, L. (2016) Insider-outsider-inbetweener? Researcher positioning, participative methods and cross-cultural educational research, Compare: A Journal of Comparative and International Education, 46(2): 235–50. doi: 10.1080/03057925.2014.928510

  • Olson, K. and Bilgen, I. (2011) The role of interviewer experience on acquiescence, Public Opinion Quarterly, 75(1): 99–114. doi: 10.1093/poq/nfq067

  • Olson, K. and Peytchev, A. (2007) Effect of interviewer experience on interview pace and interviewer attitudes, Public Opinion Quarterly, 71(2): 273–86. doi: 10.1093/poq/nfm007

  • Palermo, T., Bleck, J. and Peterman, A. (2014) Tip of the iceberg: reporting and gender-based violence in developing countries, American Journal of Epidemiology, 179(5): 602–12. doi: 10.1093/aje/kwt295

  • Park, H., Currivan, D., Wang, K., Heddon, S., Hughes, A. and Painter, D. (2014) National survey on drug use and health: Summary of methodological studies, 1971–2014, No. 0212800.001.208.007.034, Rockville, MD: Substance Abuse and Mental Health Services Administration.

  • Paulhus, D.L. (2001) Socially desirable responding: the evolution of a construct, in H. I. Braun, D. N. Jackson and D. E. Wiley (eds) The Role of Constructs in Psychological and Educational Measurement, 1st edn, Mahwah, New Jersey: Routledge, pp 49–69.

  • Perinelli, E. and Gremigni, P. (2016) Use of social desirability scales in clinical psychology: a systematic review, Journal of Clinical Psychology, 72(6): 534–51. doi: 10.1002/jclp.22284

  • Schuman, H. and Converse, J.M. (1971) The effects of black and white interviewers on black responses in 1968, Public Opinion Quarterly, 35(1): 44–68. doi: 10.1086/267866

  • Shepherd, D.A., Zacharakis, A. and Baron, R.A. (2003) VCs’ decision processes: evidence suggesting more experience may not always be better, Journal of Business Venturing, 18(3): 381–401. doi: 10.1016/S0883-9026(02)00099-X

  • Singer, E., Frankel, M.R. and Glassman, M.B. (1983) The effect of interviewer characteristics and expectations on response, Public Opinion Quarterly, 47(1): 68–83. doi: 10.1086/268767

  • Steele, F. and Curtis, S.L. (2003) Appropriate methods for analyzing the effect of method choice on contraceptive discontinuation, Demography, 40(1): 1–22. doi: 10.1353/dem.2003.0009

  • Steele, F. and Diamond, I. (1999) Contraceptive switching in Bangladesh, Studies in Family Planning, 30(4): 315–28. doi: 10.1111/j.1728-4465.1999.t01-3-.x

  • Steele, F., Diamond, I. and Amin, S. (1996) Immunization uptake in rural Bangladesh: a multilevel analysis, Journal of the Royal Statistical Society. Series A: Statistics in Society, 159(2): 289–99. doi: 10.2307/2983175

  • Stephenson, R. (2009) Community factors shaping HIV-related stigma among young people in three African countries, AIDS Care, 21(4): 403–10. doi: 10.1080/09540120802290365

  • Stephenson, R. and Elfstrom, K.M. (2012) Community influences on antenatal and delivery care in Bangladesh, Egypt, and Rwanda, Public Health Reports, 127(1): 96–106. doi: 10.1177/003335491212700111

  • Stephenson, R., Baschieri, A., Clements, S., Hennink, M. and Madise, N. (2007) Contextual influences on modern contraceptive use in sub-Saharan Africa, American Journal of Public Health, 97(7): 1233–40. doi: 10.2105/AJPH.2005.071522

  • Stephenson, R., Koenig, M.A., Acharya, R. and Roy, T.K. (2008) Domestic violence, contraceptive use, and unwanted pregnancy in rural India, Studies in Family Planning, 39(3): 177–86. doi: 10.1111/j.1728-4465.2008.165.x

  • Stephenson, R., Elfstrom, K.M. and Winter, A. (2013) Community influences on married men’s uptake of HIV testing in eight African countries, AIDS and Behavior, 17(7): 2352–66. doi: 10.1007/s10461-012-0223-0

  • Sweller, J., Ayres, P. and Kalyuga, S. (2011) The expertise reversal effect, in Cognitive Load Theory, New York: Springer, pp 155–70.

  • Tufts University School of Medicine (n.d.) Computer-assisted self-interview (CASI), Retrieved from http://acasi.tufts.edu/casi.htm

  • Ventimiglia, M. and MacDonald, D.A. (2012) An examination of the factorial dimensionality of the Marlowe-Crowne social desirability scale, Personality and Individual Differences, 52(4): 487–91. doi: 10.1016/j.paid.2011.11.016

  • Visschers, J., Jaspaert, E. and Vervaeke, G. (2017) Social desirability in intimate partner violence and relationship satisfaction reports: An exploratory analysis, Journal of Interpersonal Violence, 32(9): 1401–20. doi: 10.1177/0886260515588922

  • Weinreb, A.A. (2006) The limitations of stranger-interviewers in rural Kenya, American Sociological Review, 71(6): 1014–39. doi: 10.1177/000312240607100607

  • World Health Organization (2013) Global and Regional Estimates of Violence Against Women: Prevalence and Health Effects of Intimate Partner and Non-partner Sexual Violence, Geneva: World Health Organization.

  • World Health Organization (2017) Violence against women, www.who.int/mediacentre/factsheets/fs239/en/
