Mixed-mode research

Researchers employ mixed-mode surveys (more than one mode of data collection) in survey projects in order to capitalize on (1) greater potential response rates and/or (2) improved coverage of the population of interest.[1] By offering more than one mode of data collection, researchers attempt to improve the attractiveness of the survey to the respondent. Research has illustrated that the public has varied preferences for different modes of survey data collection. For example, in a 2006 MRA telephone survey,[2] adult Americans indicated which mode of data collection they would select first if they were given the choice. The results of that study, the 2006 Research Profession Image Study (published 2007), illustrate the variety in respondent preferences by mode of data collection.

By approaching respondents with more than one mode of data collection, the researcher attempts to raise the appeal of participation to the respondent. However, the success of this technique in boosting the survey response rate appears to depend on the manner in which the respondent is offered each mode. Presenting multiple survey modes in sequence has been demonstrated as a successful way to improve response rates. By contrast, research has generally not shown that offering the respondent a choice of modes up-front will boost cooperation.

A second major advantage offered by mixed-mode survey designs is an increase in the coverage of the population of interest. As technologies and modes of communication have changed over time, there have been increased options for contacting individuals and organizations. Utilizing multiple modes of data collection may offer the researcher the opportunity to include more cases in his/her sample. Other advantages mixed modes may offer include a reduction of cost and increased efficiency, the establishment of credibility and trust with the respondent, and an improvement in the degree of privacy offered to the respondent (de Leeuw, 2005).

Problematically, however, since each mode of data collection is unique in terms of the transmission of information and the environment of the interview, survey questions are received and processed by respondents in different ways. "Mode effects" are an important downside to the advantage(s) offered by a mixed-mode design. "Mode effects" refer to the difference(s) in the way that a respondent may answer questions through one mode of survey data collection (e.g., telephone) versus another (e.g., mail or web). Clarissa David, in "Polling America: An Encyclopedia of Public Opinion," defines the scope of mode effects: "the term 'mode effects' does not include all effects or errors caused by mode, but rather it refers more specifically to certain types of systematic errors in survey response... Examples of mode-related response errors are survey satisficing, social desirability, concurrent validity, recency and primacy... it should be noted that other survey errors that can be influenced by mode, such as nonresponse and sampling errors, are excluded from the usual coverage of mode effects." By defining mode effects in this manner, David highlights the nature of the problem: the linkage between the mode of data collection and the respondent's processing of, and/or responses to, survey questions. Numerous theories have been developed as to why respondent answers may differ from one mode to the next.[4] Further, evidence suggests that mode effects are a legitimate threat to the data quality of surveys.
As such, several prudent steps may be advisable for concerned researchers when considering the use of mixed-mode designs:
1. Consider mode effects in the design phase of research. The researcher should weigh the advantages and disadvantages (including data quality and financial considerations) of a mixed-mode design before electing to utilize multiple modes.
2. If a mixed-mode design is selected, determine a questionnaire development strategy to minimize the likelihood of mode effects. Traditional questionnaire design involves crafting questions to fit the mode of data collection being used.
3. If a mixed-mode design is implemented, examine the data for modal differences. If differences have occurred between modes, consider multiple explanations (a simple check of this kind is sketched below).
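As a concrete, if simplified, illustration of the third step, the following Python sketch cross-tabulates responses to a single item by mode and applies a chi-square test of independence. The DataFrame, column names (mode, q1), and data are hypothetical placeholders rather than the procedure of any study cited here; a real analysis would also account for weighting and differential nonresponse.

```python
# Minimal sketch of checking one survey item for a modal difference in the
# distribution of responses. The `mode` and `q1` column names and the data
# below are hypothetical, for illustration only.
import pandas as pd
from scipy.stats import chi2_contingency

def modal_difference_test(df: pd.DataFrame, item: str, mode_col: str = "mode"):
    """Cross-tabulate responses by mode and run a chi-square test of independence."""
    table = pd.crosstab(df[mode_col], df[item])
    chi2, p_value, dof, _expected = chi2_contingency(table)
    return table, chi2, p_value

# Made-up example: two modes answering one 3-category item.
df = pd.DataFrame({
    "mode": ["web"] * 6 + ["phone"] * 6,
    "q1": ["agree", "agree", "neutral", "disagree", "agree", "neutral",
           "agree", "agree", "agree", "agree", "neutral", "agree"],
})
table, chi2, p = modal_difference_test(df, "q1")
print(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

Note that a significant difference in such a check flags a modal discrepancy but does not, by itself, identify its source.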

It may not be possible to determine the source of error, but the researcher should investigate the problem by considering all possible causes. If mode effects are the likely cause of error, attempt to discern probable causes, and consider the explanations presented in the appendix of this document (Tables 1 and 3 and the annotated bibliography) as well as other sources to determine whether there are plausible explanations for the error. A description of all sources of error will help the end-user of that research to evaluate the results. Further, researchers should justify their decision of whether or not they combined the data from separate modes together for analysis.

Table 1: Theories of mode effects. Questions tend to be asked in different ways for different modes, and different measurements may result in different results (see Table 3 for a summary of modal tendencies in question formatting and design). Different modes provide different context. Because aural modes do not present the respondent with a visualization of the question and response options, the respondent is forced to rely on their short-term, working memory. Interviewer-administered modes may result in "social desirability bias," where the respondent feels the need to respond in a socially acceptable manner to avoid judgment. Respondents may also be more likely to acquiesce (e.g., select the "agree" option on an agree/disagree scale) in interviewer-administered surveys, since the presence of an interviewer mirrors typical everyday social interaction. Interviewer-administered modes may be subject to more cognitive processing, more thoughtfulness, and less satisficing than self-administered modes; in self-administered modes, the respondent controls the pace and may skip around or advance through the interview with less engagement and thoughtfulness. Notably, a counter-argument is that self-administered modes may encourage more cognitive engagement, since the respondent has the ability to answer questions at their own pace and will not feel pressured to avoid the awkward silences that pausing on the telephone or in person to fully weigh the question would create. Interviewer-administered modes may result in less item non-response and responses with greater detail; a counter-argument to this theory is that self-administered modes may offer the respondent more privacy to answer sensitive questions. Interviewer- and self-administered modes may differ in encouraging the respondent's access to their memory: in self-administered modes, the respondent may consult records and may feel free to spend more time on a given question absent the presence of an interviewer. Respondents to telephone surveys may feel pressure to provide shorter responses to open-ended questions than in face-to-face (and perhaps other) surveys (Bowling, 2005); this may encourage telephone respondents to give comparatively shorter answers to open-ended questions.

Table 2: Questionnaire development strategies. Questions may be crafted to optimally fit the mode of data collection in use, or questions may be crafted to deliver an equivalent question stimulus between modes, even though the result may not be optimal for either or both modes. Summarized from "Survey Mode as a Source of Instability in Responses across Surveys" by Don A. Dillman and Leah Melani Christian.
Table 3: Modal tendencies in question formatting and design.
Telephone: more screening/branching as a result of simplified questions; shorter surveys than face-to-face (easier to break off the interview than face-to-face); interviewer ability to probe; ability to hide "don't know" and "refusal" options unless volunteered.
Interactive voice response: short/brief surveys (very easy to break off the interview); brief wording and shorter questions.
Mail: avoidance of branching questions and a tendency toward relatively more complex questions that incorporate the several dimensions of the simpler questions; fully labeled scale options; tendency to use a "check-all-that-apply" format over a yes/no format; open-ends less attractive since no interviewer is present to probe, and often broken into multiple parts; a clear choice (or lack thereof) offered for "don't know" or "refusal" options; visual context present; ability for the respondent to see previous and later questions; potentially more complex questions due to the ability to complete the survey at one's own pace, see other questions, and consult records.
Web: graphic features possible (sound, video, color, unique formats); choice of whether previous and later questions are visible (as divided by screen) and the context (or lack thereof) this creates; choice of whether to allow the respondent to revisit previous questions; tendency to use check-all-that-apply question formats over yes/no; radio buttons and check boxes as standard options; longer, fully labeled scales; branching facilitated by questionnaire programming; pressure for the survey to be short; choice to force a question to be answered or to allow the respondent to skip ahead; choice of whether to overtly present "don't know" or "refusal" options.

Table 4: Evidence of common mode effects reported in survey research. Social desirability: social desirability bias has been found in interviewer-administered modes of data collection compared to self-administered modes, where interviewer-administered modes tend to elicit responses in the direction of the socially desirable answers. In another comparison, web respondents illustrated no differences compared to telephone in terms of acquiescence or attitudes and support for scientific research (Fricker, 2005). Schuman and Presser conducted a variety of experiments to test response-order effects in telephone research (Schuman 1996); results were mixed, and the authors had difficulty predicting response-order effects in advance. In one experiment, researchers found that scale formats and color contrast can affect survey responses (Tourangeau, 2007). This research demonstrates the power of the context that formatting can create for survey responses, particularly in cases where the respondent is not given overt and clear meaning from the question and response options alone (and may fall back on other cues). Item non-response may tend to be higher in some modes of data collection than others, but the results appear to be somewhat mixed (Kreuter, 2008). Further, a telephone interviewer-administered mode produced more item non-response than telephone self-administered in one study, except for a race-related question, in which case the opposite was true (Harmon 2009). Web respondents have been shown to differentiate in their responses to survey questions less than respondents in face-to-face modes (Heerwegh, 2008) and telephone modes (Fricker, 2009).

Annotated bibliography. "Comparisons of telephone and postal survey modes on respiratory symptoms and risk factors," Practice of Epidemiology 155. Compared telephone versus mail respondents by first interviewing a large group (15-70 years in age) in areas in Norway by mail, then later following up with a re-interview of a small subsample of respondents.

Differences between modes: the postal survey resulted in a significantly larger percentage of respondents with "morning cough" than the telephone survey; there were significant differences between the modes in dyspnea (without a clear pattern); there was significantly less passive smoking in the home and at work reported by postal survey takers; and there were small differences for smoking, education, and episodes of phlegm and cough. Similarities between modes: the authors note that there were no differences for pack-years, number of siblings, and number of siblings with asthma. There were "small changes of little significance" with height, and the authors report that there was no "global test of tendency to report more symptoms or diagnosis with one survey mode."

"The effects of mode and format on answers to scalar questions in telephone and web surveys." Examined mode effects by utilizing six survey versions (three telephone, three web) administered to Washington State University (Pullman campus) undergraduate respondents. Telephone respondents tend to give more positive answers (mean) overall to the 11-point scales (worst possible/best possible), but do not demonstrate a statistically significantly higher percentage of responses in the extreme positive category. The authors note that telephone research has been demonstrated to generate more positive responses regardless of whether it is compared to web, mail, or face-to-face.

"Response rate and measurement differences in mixed-mode surveys using mail, telephone, interactive voice response (IVR) and the internet," Social Science Research 38. Compared telephone, mail, web, and IVR surveys for mode effects utilizing a "quasi-general population" sample on satisfaction questions towards long-distance telephone services. Questions were formatted similarly between modes, and non-response was analyzed by comparing the demographics of non-respondents to respondents. Further, the authors used follow-up with mixed modes to examine possible differences between early and late responders. Findings include: 1) Mode effects and "no opinion/don't know": based on earlier research geared at delivering an equivalent stimulus, web respondents were not given the direct "no opinion" choice in the survey question. Differences between modes: the three modes differed across the 15 question items, with the greatest differences occurring between web and in-person (the authors claim an average absolute difference of between 4 and 6 percentage points) versus telephone and in-person (the authors claim an absolute difference of 2 percentage points).

Public Opinion Quarterly 69. Compared telephone and web surveys for mode effects utilizing a sample of respondents with internet access drawn from a general population RDD design. Findings include: 1) Attitudes toward and support for science and scientific research were gauged using 5-point agree/disagree scales. No statistically significant differences were found between the samples of internet users who completed via the web versus those who completed via the telephone mode. There were no differences between the two modes in the number of respondents displaying acquiescing behavior (as measured in this case as the proportion of "agree" or "yes" answers to questions). A pattern showed respondents in the T-ACASI collection mode reporting more responses than in T-IAQ in the direction that would be expected given social desirability.
The use of different modes produced no statistically significant differences in the composition of demographic subgroups in the samples (gender, age, marital status, education, race/ethnicity, region, urbanization, and sample strata [Baltimore and national]). The greatest mode effects were found among the less educated and younger respondents, and also in the Baltimore stratum as compared to the national stratum. A mode effect was also found for a question regarding corporal punishment and for marijuana use among black respondents. Findings include: 1) Face-to-face questions were presented visually on cards to aid the comparison to the web mode.

"Social desirability bias in CATI, IVR, and web surveys: the effects of mode and question sensitivity," Public Opinion Quarterly 72. Compared IVR, CATI, and web surveys in a comparison of social desirability bias. The experiment involved recent University of Maryland college graduates assigned randomly to one of the three different modes of data collection.

The authors note that the traditional lack of validation data forces investigators to make two assumptions in determining which mode leads to "better" results. The first assumption is that social desirability concerns lead respondents to underreport socially undesirable behaviors, so that the data collection mode that yields higher levels of reporting is the more accurate one. There were no statistically significant differences between the modes in terms of validation records, meaning that statistically significant differences in non-response bias could not be found between IVR, CATI, and web even though response rates were different. Most differences between modes were not statistically significant, but mode sample sizes were somewhat limited (IVR: 410; CATI: 368; web: 329). The authors compared respondents of each mode against the actual records that were available for those respondents. Social desirability (over-reporting of positive attributes and under-reporting of negative attributes) was substantial across all modes. Over-reporting was a problem for all modes, though the prevalence was roughly similar in all modes.

Public Opinion Quarterly 72. Compared forced-choice "yes" or "no" question formats to "check-all-that-apply" question formats with a mixed-mode telephone and web survey delivered to Washington State University undergraduates.

"Evaluating mode effects on a survey of behavioral health care users," Proceedings of the Annual Meeting of the American Statistical Association, August 5-9. Compared telephone and mail data collection for social desirability mode effects across both general and behavioral health care. The authors transformed the items showing statistically significant differences by mode into dichotomous variables and tested them once again for differences, this time to illustrate the direction of bias. In all cases, respondents to the telephone survey modes offered responses that were more socially desirable. The authors theorized that significant differences may have been caused by the types of respondents who participated in the survey, as opposed to social desirability or another type of mode effect. Specifically, they were concerned that less healthy respondents (and heavier service users) might be more likely to participate in one mode versus another, thereby biasing results. As such, they included self-rated health and reported service use as controls in a regression model, where the items discussed above were the dependent variables. After controlling for health and service use, only three variables demonstrated what the authors believe is a mode effect. Doing so had an important impact on the results of the analysis, and it raises the question of whether other projects would find similar controls to be as useful. Finally, after controlling for non-response, the authors do find what they believe are mode effects for three items. They attribute this to social desirability, but do not address other causes of mode effects; it is possible that these mode effects are attributable to such causes.

Tourangeau, Roger, and Couper, Mick P. At the same time, the researchers tested different scales where respondents received either a scale ranging from -3 to +3 (with verbal end labels), 1 to 7 (with verbal end labels), verbal end-point labels with no numbers, or verbal labels for all points but no numbers.

Bowling, A. "Mode of questionnaire administration can have serious effects on data quality." Journal of Public Health 27.
Brøgger, Jan, Bakke, Per, and Eide, Geir E. "Comparisons of telephone and postal survey modes on respiratory symptoms and risk factors." Practice of Epidemiology 155.
Christian, Leah Melani, and Dillman, Don A.
de Leeuw, Edith D. "To mix or not to mix data collection modes in surveys." Journal of Official Statistics 21.
Dillman, Don A.
Kreuter, Frauke, Presser, Stanley, and Tourangeau, Roger. "Social desirability bias in CATI, IVR, and web surveys: the effects of mode and question sensitivity." Public Opinion Quarterly 72.
Schuman, Howard, and Presser, Stanley.

"Evaluating mode effects on a survey of behavioral health care users." Proceedings of the Annual Meeting of the American Statistical Association, August 5-9.
Tourangeau, Roger, and Couper, Mick P.

[4] See the appendix, Table 1, for a summary of relevant theories as to why mode effects may occur.

The field of mixed methods has only been widely accepted for the last decade, though researchers have long been using multiple methods, just not calling them "mixed." Mixed methods research takes advantage of using multiple ways to explore a research problem. The design can be based on either or both perspectives. Research problems can become research questions and/or hypotheses based on prior literature, knowledge, experience, or the research process. Sample sizes vary based on the methods used. Data collection can involve any technique available to researchers. Interpretation is continual and can influence stages in the research process.

Why use mixed methods? They can be easy to describe and to report; they can be useful when unexpected results arise from a prior study; they can help generalize, to a degree, qualitative data; they are helpful in designing and validating an instrument; and they can position research in a transformative framework.

What are some weaknesses? Discrepancies can arise between different types of data; some designs generate unequal evidence; it can be difficult to decide when to proceed in sequential designs; and there is little guidance on transformative approaches.

Methodologist John Creswell suggested a systematic framework for approaching mixed methods research. His framework involves four decisions to consider and six strategies for mixed methods designs (Creswell, 2003). Concurrent transformative strategy: characterized by the use of a theoretical perspective, reflected in the purpose or research questions of the study, to guide all methodological choices; its purpose is to evaluate a theoretical perspective at different levels of analysis.

Program statement: mixed-mode survey design. Key words: mixed-mode, multiple mode, multi-mode, hybrid surveys, equivalence of instruments, dual frame surveys, data quality, survey error. Contact for further discussion: Office of Survey Methods Research, PSB, Bureau of Labor Statistics.

Background: As noted by de Leeuw (2005), mixed-mode surveys have been in use for a long time and have become the norm in some countries as survey managers seek to use collection procedures that produce the best possible data within existing constraints of time and budget. "Modes" refer to approaches used either to contact or to obtain data from survey respondents. The decision depends on a complex interplay of factors that the survey manager must contemplate, including the complexity of the information being collected, the time burden of the interview, and the sensitivity of the information. The mixing of modes and approaches seems to be limited at times only by the creativity of survey managers.
For example, certain survey activities, such as prenotification and reminder messages, can be sent using one mode, while data collection might rely on a different, but single, mode. Alternatively, prenotification, reminder messages, and the initial questionnaire might be sent via mail and respondents given the option of reporting using mail, phone, the internet, or some other mode. Of the two basic approaches to data collection (single or multiple modes), the use of multiple modes is of most interest because, as de Leeuw states it, mixed-mode systems with unimode data collection appear to be a "win-win" situation. On the other hand, systems with more than one mode of data collection may increase the likelihood of measurement error because the survey question may appear somewhat differently under different modes. In addition to the varied use of different technologies, widely different organizational approaches may be implemented for collecting the data. In fact, this occurs in surveys like the Current Population Survey (CPS), where 1st and 5th month interviews are typically done face-to-face, but subsequent interviews are commonly done from the interviewer's home using a laptop computer, which is, in essence, decentralized CATI. Although survey managers have been very creative with their use of modes, there may be a hidden price in terms of data quality. According to Dillman and Christian (2003), evidence exists that survey mode can affect respondent answers to questions, even when questions are worded the same. As a result, they caution that differences observed between time 1 and time 2 may be due to mode changes, rather than to any actual differences in behavior or opinion. Reporting rates and data quality differ substantially when self-administered and interviewer-administered modes of data collection are compared (Turner, Lessler, and Gfroerer, 1992).

However, as with many methodological differences that affect attitude and opinion items or the reporting of potentially sensitive behaviors, it is not clear if the effects of using multiple data-collection modes will generalize to government establishment surveys, where the data are mostly factual in nature. Similarly, government household surveys that deal with topics such as work, education, and expenditures may be relatively immune to changes in data collection mode. Mixed-mode survey approaches are widely used by the Bureau of Labor Statistics (BLS). In many surveys, a basic assumption appears to be that offering multiple modes of reporting makes the reporting task easier for respondents, which will lead to higher response and better quality data. In addition, if respondents can be encouraged to use the more cost-effective modes, the costs of data collection can be significantly reduced, or at least better controlled. However, the question of whether offering concurrent, multiple modes of responding actually leads to higher response does not seem to have a clear answer, but the evidence is much clearer that the use of sequential mixed modes (for example, conducting a telephone follow-up after an initial questionnaire mailing) does lead to improved response (de Leeuw, 2005). From a research perspective, the impact of multiple modes on measurement error is also difficult to determine because the choice of mode is often determined by the respondent either explicitly or implicitly (for example, by failing to respond via the desired mode). Therefore, self-selection can lead to differences that are confounded with the data-collection mode. Multiple data collection modes are usually welcomed by survey managers, especially when they offer the opportunity to reduce costs or to improve the timeliness of data. In addition to providing a secure common gateway, the IDCF requires that all survey applications meet internal standards for graphical user interfaces so that on-line questionnaires have the same look and feel. As previously noted, the Current Employment Statistics (CES) program uses a variety of reporting modes, including mail, phone, fax, and the internet. Because of the low cost and improved timeliness of internet reporting, there has been a great deal of interest in encouraging increased use of this mode within the CES. Recent research conducted by Rosen and Gomes (2004) explored the following question: Will web-eligible TDE (touchtone data entry) respondents be willing to switch to web reporting? As it turned out, fax was the most cost-effective contact method when converting respondents from TDE to web reporting. In summary, the use of multiple modes of data collection is a trend that appears to be increasing, rather than decreasing. However, a recent paper by McGrath (2005) revealed that about 42 percent of the interviews are currently being done by phone (decentralized CAPI), with unknown effects on data quality. As noted previously, with establishment surveys the effects of using multiple modes are assumed to be benign because the surveys tend to be much shorter than household surveys and the questions asked are factual in nature. Still, the validity of this assumption remains open. Questions: What impacts do alternative data-collection modes have on data quality? Is there any evidence that different data collection modes result in data of differing quality for BLS surveys?
As survey managers increasingly encourage respondents to use the internet or other modes, what approaches to sample design and evaluation can be taken so that survey managers are able to measure the bias associated with different modes?

Telephone and web: mixed-mode challenge. Jessica Greene, Howard Speizer, and Wyndy Wiitala. Health Services Research.

Abstract. Objective: To explore the response rate benefits and data limitations of mixing telephone and web survey modes in health-related research. Data sources/study setting: We conducted a survey of salaried employees from one large employer in the manufacturing sector in the summer. Study design: We randomized 751 subjects, all of whom had previously completed a web survey, to complete a health-related survey either by telephone (with web follow-up for nonrespondents) or over the web (with telephone follow-up). Findings: Survey response rates were significantly higher for the mixed-mode survey than they would have been if we had fielded either an exclusively telephone or web survey (25 and 12 percent higher, respectively). Telephone respondents were also more likely to "agree" to knowledge statements and to provide the same response across a series of items with similar response scales than were web respondents. Conclusions: Mixed-mode telephone/web surveys can substantially boost response rates over single-mode surveys. Modal differences in responses can be minimized by handling missing data options consistently in the two modes, avoiding agree/disagree formats, and not including questions on personal lifestyle or other sensitive topics. Keywords: mixed mode survey, survey mode effects, web survey, telephone survey.

The challenges of conducting telephone-based survey research have increased substantially over the last decade. The promises and challenges of web survey research have been well documented (Taylor 2000; Couper 2000; Fricker and Schonlau 2002; Fricker et al.). Web surveys are generally less expensive, faster to administer, and can display more complex visual information than other survey modes.

Understanding the extent to which survey mode may influence responses is important, particularly given the emergence of research that mixes telephone and web data collections (Olmstead, White, and Sindelar 2004; Satia, Galanko, and Rimer 2005; Greene et al.). Typically, researchers begin with one mode, usually the less expensive, and follow up with nonresponders using a more costly mode (de Leeuw 2005). Some offer two modes concurrently or, in longitudinal studies, switch modes from one wave of data collection to another (Dillman and Christian 2005). The response rate and cost benefits of telephone/web mixed-mode surveys should be considered in light of differences in response patterns that are caused by survey mode. Empirical evidence, however, is mixed over whether or not there are modal differences in acquiescing (Bowling 2005). Given the similarities between the two survey modes, however, it is likely that many of the differences observed between mail and telephone surveys also exist between web and telephone surveys. To date, only a small body of published research has compared modal differences in survey responses between the telephone and the web. As a result, disaggregating the influence of selection versus mode has been a key limitation. In sum, the influence of the web versus telephone on survey data is still unclear, and it has not been explored in depth for health-related topics. In this study, we seek to examine the benefits and challenges of conducting a mixed web and telephone survey on health issues. Using this experimental design, we examined two key research questions: (1) What is the benefit of offering two sequential modes rather than a single-mode survey to response rates and the representativeness of the sample? (2) To what degree does survey mode influence responses to health-related behavior, attitude, and knowledge questions? To address the second question, we test the degree to which we observe modal differences in social desirability, missing data, acquiescence, and question straight-lining.

Prior mixed-mode survey: Our experimental study is part of a larger study on the influence of health insurance plan design. For the larger study, we conducted a web/telephone mixed-mode baseline survey of employees at a large company during the summer of 2004. We opted for a mixed-mode design over an exclusively telephone survey because of its lower cost and the potential for a higher response rate. The web was the primary survey mode we used, and we conducted follow-up with nonrespondents first by e-mail (only for salaried employees; we had no e-mail addresses for hourly workers) and then by telephone (for all employees). Among salaried employees in the 2004 study, we obtained a significantly higher response rate from the mixed-mode approach than we would have if we had fielded an exclusively web survey (76 versus 63 percent). Given our design, it was unclear whether these differences were due to survey mode or to selection of who completed the survey on the web versus the telephone. Telephone follow-up was used for those who did not respond after three e-mail reminders. The questionnaire was not modified for use in the two different survey modes.
The impact of this difference is among the questions we examine in the analyses.

Measures: We investigated whether survey mode influenced responses to questions about health status, health care utilization, health-related behaviors, and knowledge of the medical care delivery system. Health status: the first measure is the widely used self-report of health status, in which respondents assess their health as excellent, very good, good, fair, or poor. Web respondents were provided with a "don't know" option as well. (Table 6: responses to the knowledge items, by experiment status.)

Analysis: To answer the first research question, on the benefit of offering a sequential mixed-mode design over a unimodal design, we calculated four response rates. Similarly, for those randomized to the web, we computed rates for web-only responders and for all responders. We then explored whether the additional respondents generated by offering the second mode differed with respect to demographics or health characteristics from those who responded using the primary mode. To the extent that the additional respondents differed, it suggests that the mixed-mode survey reached a more representative sample. Given our interest in modal differences in survey responses, we sought to compare characteristics of switchers and nonswitchers using data provided in one consistent mode. This was only possible for telephone respondents, due to the small number of people who switched from telephone to the web. To answer the second research question, on modal differences in survey responses, we focused only on those respondents in the experimental study who completed the survey in the intended mode.

We also computed a measure of nondifferentiation (2001), calculating the likelihood of selecting the same response for all items, or straight-lining.

Offering two survey modes sequentially significantly increased survey response rates compared with offering either an exclusively telephone or web-based survey. When we included the 41 people assigned to the telephone who switched mode to complete the survey over the web, the response rate for this group increased to 84 percent (209 out of 250). We tested whether those who switched survey mode (web to telephone) differed from those completing the survey in the assigned telephone mode. To the extent that the switchers differed in terms of sociodemographics, it suggests that including the telephone follow-up mode was successful in reaching a broader group of respondents and, thus, a more representative sample. In sum, offering the mixed-mode design increased the response rate by 9-17 percentage points, and appears to have included more minority and low-income respondents. We now turn to our analysis of modal differences in survey responses. We can fully attribute this difference to mode because in the prior year, when all responses were over the web, there were no differences in health status between these two groups. For measures of health care utilization and number of chronic conditions, we find no significant differences by mode. We tested whether collapsing response options would remedy the strong mode effect, but the effect was still pronounced when we ran statistical tests with dichotomized variables. (Table 3: responses to healthy behavior items, by experiment status.) Table 4 displays the risky cost-saving and information-seeking behaviors by mode. Overall, there were more true items than false items.

Discussion: This study uses a randomized experiment to examine the benefits and limitations of mixed-mode web and telephone surveys. We find that the mixed-mode design increased response rates over fielding exclusively telephone or web-based surveys by 25 and 12 percent, respectively. The study also finds that survey mode influences social desirability bias, missing data, acquiescence, and nondifferentiation, often in nontrivial ways. Below we discuss these patterns and the practical implications for designing mixed-mode surveys. The modal differences in social desirability bias that we observed varied considerably in magnitude, depending on the type of question. The magnitude of modal differences was quite substantial for these items, suggesting that these types of items not be included in mixed-mode surveys, or that analyses combining web and telephone data statistically adjust for survey mode. The results also revealed differences in the provision of missing data for specific types of questions. Despite the fact that web respondents viewed a missing data option that was not mentioned over the telephone, there were very low levels of missing data regardless of mode for most questions. These observed modal differences underscore the importance of developing mixed-mode surveys that are exactly the same for each mode, including the treatment of missing data. We also found modal differences in agreeing to factual statements assessing knowledge. Based on these findings, we recommend avoiding agree/disagree and true/false type questions in mixed-mode surveys. Our findings suggest that researchers can reduce modal differences by designing surveys that can be fielded identically in the two modes, avoiding sensitive questions about lifestyle, and not including true/false or agree/disagree questions.
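To make the straight-lining idea concrete, the sketch below flags respondents who give an identical answer across a battery of similarly scaled items and compares the straight-lining rate by mode. It is only an illustration of the general measure described above, not the authors' actual computation; the column names (mode, item1 through item4) and the data are hypothetical.

```python
# Illustrative sketch of a straight-lining (nondifferentiation) check:
# flag respondents who give the identical response across a battery of
# similarly scaled items, then compare the rate of straight-lining by mode.
# Column names and data are hypothetical placeholders.
import pandas as pd
from scipy.stats import chi2_contingency

battery = ["item1", "item2", "item3", "item4"]

df = pd.DataFrame({
    "mode":  ["web", "web", "web", "phone", "phone", "phone"],
    "item1": [4, 3, 5, 4, 4, 2],
    "item2": [4, 4, 5, 4, 4, 3],
    "item3": [4, 2, 5, 4, 4, 2],
    "item4": [4, 5, 5, 4, 4, 1],
})

# A respondent "straight-lines" when every item in the battery has the same value.
df["straightlined"] = df[battery].nunique(axis=1).eq(1)

print(df.groupby("mode")["straightlined"].mean())  # share of straight-liners per mode

# Simple test of whether straight-lining differs by mode.
table = pd.crosstab(df["mode"], df["straightlined"])
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

In practice, an exact test or a model with covariates would be preferable to a simple chi-square on cells this small, and covariate adjustment helps separate mode effects from selection.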
For surveys on personal lifestyle issues or sensitive matters, we recommend not mixing telephone and web modes. If non-web follow-up is necessary for such a survey, researchers should consider telephone audio computer-assisted self-interviewing or other modes that do not involve an interviewer. Finally, given the substantial modal differences we observed for some items, we caution researchers against mixing modes across waves of longitudinal studies. While the lower cost of the web may incentivize researchers to convert telephone respondents from one wave of a study to web respondents in subsequent waves, the change in mode would confound isolating changes due to the factor of interest. In addition to the lessons for mixed-mode survey design, this study highlights the comparative strengths of web surveys. More research is needed to examine the effect of mode for those with lower socioeconomic status and less familiarity with the web. Additionally, we only examine one new survey mode, the web, while a number of other emerging survey approaches are being used. It will be important for survey researchers going forward to conduct randomized tests of the differences in responses when introducing new modes.

Acknowledgments: The authors would like to thank the "Changes in Health Care Financing and Organization" (HCFO) initiative, a program of the Robert Wood Johnson Foundation, for providing support for this study.

References
Bowling, A.

"Effects of survey mode on self-reports of adult alcohol consumption: a comparison of mail, web and telephone approaches."