Cognitive pretesting

Cognitive pretesting, or cognitive interviewing, is a field research method in which data are collected on how subjects answer interview questions. It is the evaluation of a test or questionnaire before it is administered.[1] It allows survey researchers to collect feedback about survey responses and is used to evaluate whether a question measures the construct the researcher intends. The data collected are then used to adjust problematic questions in the questionnaire before the survey is fielded to the full sample.[2][3][4][5][6]

Cognitive interviewing generally collects the following information from participants: evaluations of how the subject constructed their answers; explanations of what the subject interprets the questions to mean; reports of any difficulties the subject had in answering the questions; and anything else that reveals the circumstances surrounding the subject's answers.

Cognitive pretesting is considered essential in testing the validity of an interview, test, or questionnaire.[7]

Purpose

The purpose of these pretests is to:

  • make sure that the test or interview is understandable
  • address any problems the participants may have had with the test
  • measure participants' attention to and curiosity about the questions
  • assess how the answer scale is used (e.g., is the whole scale being used, or do answers vary too much?)
  • assess question order and other context effects
  • identify problems with the interviewers
  • address any technical problems with the test (e.g., glitches in the technology or grammatical errors)
  • measure how long it takes to complete the test or interview.[1][4][6]

Types

In general, there are many methods for conducting a cognitive pretest, including conventional pretesting, cognitive interviewing, behavior coding, respondent debriefing, group discussion, expert review, eye tracking, and web probing.[1]

Conventional pretesting - This is similar to a rehearsal that imitates what the real test or interview will be like: a simulation that takes place before the real administration, using the same method that will be used in the actual interview or test.[1][8]

Cognitive pretesting (cognitive interviewing) - This is very similar to conventional pretesting, except that participants are actively asked about the questions as they take the test; the questioning is conducted during the interview or test itself.[1][6]

Pretests can also be administered in several different modes, including written, oral, and electronic surveys.[4]

Techniques

Interviewers use several techniques in cognitive pretesting to elicit the information needed to ensure a sound interview or questionnaire.

The think-aloud technique - The interviewer asks the interviewee to vocalize their thoughts and explain how they arrived at their answer. This can be done concurrently (during the interview) or retrospectively (after it).[1][2]

Probing technique - The interviewer asks the interviewee one or more follow-up questions, 'probing' about the questions asked, the terminology used, or the responses given.[1][2] Probes can be concurrent (administered during the task, but so as not to disrupt it) or retrospective (administered after the task).[9]
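
As a rough illustration, a pretest protocol often pairs each draft question with scripted probes and notes whether each probe is to be asked concurrently or retrospectively. The Python sketch below shows one way such a schedule might be organized; the field names and example probes are hypothetical and not drawn from the cited sources.

```python
# Hypothetical sketch of a cognitive interview probe schedule.
# Field names and example probes are illustrative, not from a published protocol.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Probe:
    text: str    # wording the interviewer reads aloud
    timing: str  # "concurrent" (during the task) or "retrospective" (after it)

@dataclass
class DraftQuestion:
    item_id: str
    wording: str
    probes: List[Probe] = field(default_factory=list)

schedule = [
    DraftQuestion(
        item_id="Q3",
        wording="In the past 12 months, how often did you visit a doctor?",
        probes=[
            Probe("What does 'visit a doctor' include for you?", "concurrent"),
            Probe("How did you arrive at that number?", "retrospective"),
        ],
    ),
]

# Interviewers can ask the concurrent probes while the respondent works through
# the item and save the retrospective probes for a debriefing at the end.
for question in schedule:
    for probe in question.probes:
        print(question.item_id, probe.timing, "->", probe.text)
```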

Paraphrasing - The interviewer asks the interviewee to repeat the question in their own words. This tests whether the questions are understandable.[1]

Confidence rating - The interviewer asks the interviewee how confident they are that they answered the question correctly.[1]

Sorting or card sorting - The interviewer asks the interviewee to group items in order to understand how the interviewee categorizes certain situations or terms.[1][9]
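
Card-sort data are often summarized with a co-occurrence matrix recording how frequently each pair of items was placed in the same pile. The sketch below, with invented items and groupings, is one minimal way to compute such counts; it illustrates the general analysis idea rather than a procedure prescribed by the cited sources.

```python
# Minimal sketch: build item-pair co-occurrence counts from card-sort piles.
# Items and participant groupings below are invented for illustration.
from collections import defaultdict
from itertools import combinations

# Each participant's sort is a list of piles; each pile is a set of card labels.
sorts = [
    [{"rent", "utilities"}, {"groceries", "dining out"}],
    [{"rent", "utilities", "groceries"}, {"dining out"}],
    [{"rent"}, {"utilities", "groceries", "dining out"}],
]

cooccurrence = defaultdict(int)
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            cooccurrence[(a, b)] += 1

# Pairs sorted into the same pile most often suggest terms that respondents
# treat as belonging to the same category.
for (a, b), count in sorted(cooccurrence.items(), key=lambda kv: -kv[1]):
    print(f"{a} / {b}: grouped together by {count} of {len(sorts)} participants")
```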

Vignettes - These are short descriptions of one or more hypothetical characters (similar to the vignettes used in psychological and sociological experiments or the anchoring vignettes in quantitative survey research) and are used to investigate the respondent's cognitive processing with regard to their survey-relevant decisions.[9][10]

Web probing - This technique implements cognitive interviewing probes in web surveys. Its strengths include standardization, anonymity, and large, fast coverage because it is administered via the web. However, web probing can only reach online population groups, probe nonresponse occurs, and probe answers that are insufficient from a content perspective cannot be followed up.[11][12]
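
In practice, a web probe is typically an open-ended follow-up item shown right after the closed question it targets. The following sketch is a hypothetical way of representing such a pairing in a survey configuration; the field names are invented and no particular survey platform's API is implied.

```python
# Hypothetical sketch of pairing a closed web-survey item with an open-ended probe.
# Field names are invented and do not reflect any particular survey platform.
web_survey_items = [
    {
        "item_id": "Q7",
        "type": "closed",
        "wording": "How satisfied are you with your neighborhood?",
        "options": ["Very satisfied", "Somewhat satisfied",
                    "Somewhat dissatisfied", "Very dissatisfied"],
    },
    {
        "item_id": "Q7_probe",
        "type": "open",
        # Probe shown on the next screen, after Q7 has been answered.
        "wording": "You answered '{answer}'. Please describe why you chose that answer.",
        "depends_on": "Q7",
    },
]

def render_probe(probe, answers):
    """Insert the respondent's earlier answer into the probe wording."""
    return probe["wording"].format(answer=answers[probe["depends_on"]])

print(render_probe(web_survey_items[1], {"Q7": "Somewhat satisfied"}))
```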

Participants and recruitment

Sample size is an important consideration in pretests. Small samples of 5 to 15 participants are common. While some researchers suggest that the sample should include at least 30 people, and that more is always better,[13] current best practice is to design the research in rounds so that changes can be retested. For example, when pretesting a questionnaire, it is more useful to conduct three rounds of nine participants than one round of 27.[9]
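
The rationale for rounds is that findings from one round inform revisions that the next round then retests, which a single large round cannot do. A toy sketch of that workflow, using the round sizes from the example above, might look like the following.

```python
# Toy sketch of an iterative pretesting plan: 27 interviews split into three
# rounds of nine, with questionnaire revisions between rounds. Numbers mirror
# the example above and are purely illustrative.
total_interviews = 27
rounds = 3
per_round = total_interviews // rounds  # 9 interviews per round

questionnaire_version = 1
for r in range(1, rounds + 1):
    print(f"Round {r}: test version {questionnaire_version} with {per_round} participants")
    # Findings from this round (problem questions, misunderstood terms, etc.)
    # drive a revision that the next round then retests.
    questionnaire_version += 1

print(f"Questionnaire after {rounds} rounds of revision: version {questionnaire_version}")
```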

There are two methods of informing participants about the pretest: participating pretests and undeclared pretests.[4]

  • In participating pretests, participants know that the test they are completing is a practice run. This approach is used mostly with probing and think-aloud techniques, that is, cognitive pretesting and interviewing.[4]
  • In undeclared pretests, participants do not know that the test is a practice run. This is closest to conventional pretesting. If more than one pretest is conducted in a study, it is recommended that a participating pretest come first and an undeclared pretest second.[4][6]

Cross-cultural research

When conducting cognitive interviews in non-English languages, recent research recommends not restricting sample selection and recruitment to non-English-speaking monolinguals, which had been a common practice among survey researchers.[14][15] When recruiting hard-to-reach respondents with particular characteristics via purposive sampling, community-based recruitment (word of mouth, endorsement from community leaders) works better than advertisements.[16][17][18]

Use by survey researchers

Cognitive interviewing is regularly practiced by U.S. Federal Agencies, including the Census Bureau,[19][20] National Center for Health Statistics (NCHS),[21] and the Bureau of Labor Statistics.[22] The NCHS maintains a database of U.S. and international agencies that have conducted cognitive interview projects and contributed reports to their depository, such as the National Science Foundation and GESIS – Leibniz Institute for the Social Sciences.[23]

Cross-cultural cognitive interviewing is practiced to evaluate survey question equivalence and sources of difficulties, as well as to repair problems related to translation.[24][25] Because of differences in communication styles and cultural norms, adaptations are needed in protocol setup[26] and design,[27] use of vignettes,[10] and verbal probing.[28]

Standards

In October 2016, the U.S. Office of Management and Budget (OMB) issued Statistical Policy Directive No. 2 Addendum: Standards and Guidelines for Cognitive Interviews that included seven standards for cognitive interviews conducted by or for U.S. Federal studies.[1] Another standard proposed by researchers is the Cognitive Interviewing Reporting Framework (CIRF) that applies a 10-category checklist to make clear what was done during the cognitive interviews and how conclusions were made based on procedures and results of those interviews.[29] In addition, a project management approach is recommended when managing cognitive interviewing studies.[30] For translated surveys, cognitive interviewing techniques, participant selection and recruitment, and project management approach must be adapted to increase their fit for use.[27]

References

  1. ^ a b c d e f g h i j Lenzner, Timo; Neuert, Cornelia; Otto, Wanda (2016). "Kognitives Pretesting". Gesis Survey Guidelines. doi:10.15465/gesis-sg_en_010.
  2. ^ a b c Tilley, Barbara C.; LaPelle, Nancy R.; Goetz, Christopher G.; Stebbins, Glenn T. (2014). "Using Cognitive Pretesting in Scale Development for Parkinson's Disease: The Movement Disorder Society Unified Parkinson's Disease Rating Scale (MDS-UPDRS) Example". Journal of Parkinson's Disease. 4 (3): 395–404. doi:10.3233/JPD-130310. ISSN 1877-7171. PMC 5086096. PMID 24613868.
  3. ^ "Pretesting - Cross-Cultural Survey Guidelines". ccsg.isr.umich.edu. Retrieved 2020-07-18.
  4. ^ a b c d e f "Writing@CSU". writing.colostate.edu. Retrieved 2020-07-18.
  5. ^ Grimm, Pamela (2010), "Pretesting a Questionnaire", Wiley International Encyclopedia of Marketing, American Cancer Society, doi:10.1002/9781444316568.wiem02051, ISBN 978-1-4443-1656-8
  6. ^ a b c d Babonea, Alina-Mihaela; Voicu, Mirela-Cristina (April 2011). "Questionnaires pretesting in marketing research". Challenges of the Knowledge Society. 1. Romania: Nicolae Titulescu University Publishing House: 1323–1330. ISSN 2068-7796.
  7. ^ "GESIS - Leibniz Institute for the Social Sciences". www.gesis.org. Retrieved 2020-07-18.
  8. ^ Hu, Shu (2014), "Pretesting", in Michalos, Alex C. (ed.), Encyclopedia of Quality of Life and Well-Being Research, Dordrecht: Springer Netherlands, pp. 5048–5052, doi:10.1007/978-94-007-0753-5_2256, ISBN 978-94-007-0753-5
  9. ^ a b c d Willis, Gordon (2005). Cognitive interviewing: A tool for improving questionnaire design. Sage. p. 146. ISBN 9780761928041.
  10. ^ a b Sha, Mandy (2016-08-01). "The Use of Vignettes in Evaluating Asian Language Questionnaire Items". Survey Practice. 9 (3). doi:10.29115/SP-2016-0013.
  11. ^ "Web Probing". GESIS - Leibniz Institute for the Social Sciences. Retrieved 2023-10-24.
  12. ^ Fowler, Stephanie; B. Willis, Gordon (2020-01-02), Beatty, Paul; Collins, Debbie; Kaye, Lyn; Padilla, Jose Luis (eds.), "The Practice of Cognitive Interviewing Through Web Probing", Advances in Questionnaire Design, Development, Evaluation and Testing (1 ed.), Wiley, pp. 451–469, doi:10.1002/9781119263685.ch18, ISBN 978-1-119-26362-3, retrieved 2023-10-24
  13. ^ Perneger, Thomas V.; Courvoisier, Delphine S.; Hudelson, Patricia M.; Gayet-Ageron, Angèle (2015-01-01). "Sample size for pre-tests of questionnaires". Quality of Life Research. 24 (1): 147–151. doi:10.1007/s11136-014-0752-2. ISSN 1573-2649. PMID 25008261. S2CID 22314144.
  14. ^ Park, Hyunjoo; Sha, M. Mandy; Willis, Gordon (November 2016). "Influence of English-language Proficiency on the Cognitive Processing of Survey Questions". Field Methods. 28 (4): 415–430. doi:10.1177/1525822X16630262. ISSN 1525-822X.
  15. ^ Goerman, Patricia L.; Meyers, Mikelyn; Sha, Mandy (2018-10-12), Johnson, Timothy P.; Pennell, Beth‐Ellen; Stoop, Ineke A.L.; Dorer, Brita (eds.), "Working Toward Comparable Meaning of Different Language Versions of Survey Instruments: Do Monolingual and Bilingual Cognitive Testing Respondents Help to Uncover the Same Issues?", Advances in Comparative Survey Methods (1 ed.), Wiley, pp. 251–269, doi:10.1002/9781118884997.ch12, ISBN 978-1-118-88498-0, retrieved 2023-10-24
  16. ^ Sha, M. Mandy; Park, Hyunjoo; Liu, Lu (2013-10-01). "Exploring the efficiency and utility of methods to recruit non-English speaking qualitative research participants". Survey Practice. 6 (3). doi:10.29115/SP-2013-0015.
  17. ^ Park, Hyunjoo; Sha, M. Mandy (2014-06-01). "Evaluating the Efficiency of Methods to Recruit Asian Research Participants". Journal of Official Statistics. 30 (2): 335–354. doi:10.2478/jos-2014-0020.
  18. ^ Sha, Mandy; Moncada, Jennifer (2017-06-01). "Successful Techniques to Recruit Hispanic and Latino Research Participants". Survey Practice. 10 (3). doi:10.29115/SP-2017-0014.
  19. ^ Virgile, M.; Katz, J.; Tuttle, D.; Terry, R.; Graber, J. (2019). "Cognitive Pretesting of 2019 American Housing Survey Modules". United States Census Bureau. Archived from the original on 2020-08-08. Retrieved 2020-05-20.
  20. ^ Childs, Jennifer; Sha, Mandy; Peytcheva, Emilia. "Cognitive Testing of the Targeted Coverage Follow-up (TCFU) Interview". Census Working Papers. Retrieved October 4, 2023.
  21. ^ "Q-Bank: Question Evaluation for Surveys". wwwn.cdc.gov. Retrieved 2023-11-05.
  22. ^ Schwartz, Lisa K. "The American Time Use Survey: cognitive pretesting". Monthly Labor Review. U.S. Bureau of Labor Statistics. www.bls.gov. Retrieved 2020-05-27.
  23. ^ "Explore Reports by Agency - Q-Bank". wwwn.cdc.gov. Retrieved 2023-11-05.
  24. ^ Willis, Gordon (May 2, 2015). "The Practice of Cross-Cultural Cognitive Interviewing". Public Opinion Quarterly. 79 (S1): 359–395.
  25. ^ Aizpurua, Eva (2020). Sha, Mandy; Gabel, Tim (eds.). Pretesting methods in cross-cultural research (Chapter 7) in The essential role of language in survey research. RTI Press. pp. 129–150. doi:10.3768/rtipress.bk.0023.2004. ISBN 978-1-934831-23-6.
  26. ^ Park, Hyunjoo; Goerman, Patricia; Sha, Mandy (2017-06-01). "Exploring the Effects of Pre-interview Practice in Asian Language Cognitive Interviews". Survey Practice. 10 (3). doi:10.29115/SP-2017-0019.
  27. ^ a b Sha, Mandy; Pan, Yuling (2013-12-01). "Adapting and Improving Methods to Manage Cognitive Pretesting of Multilingual Survey Instruments". Survey Practice. 6 (4). doi:10.29115/SP-2013-0024.
  28. ^ Mneimneh, Zeina Nazih (2018-07-25). Sha, Mandy; Behr, Dorothée (eds.). "Probing for sensitivity in translated survey questions: Differences in respondent feedback across cognitive probe types". Translation & Interpreting. 10 (special issue on translation of questionnaires in cross-national and cross-cultural research): 73–88. ISSN 1836-9324.
  29. ^ Boeije, Hennie; Willis, Gordon (August 2013). "The Cognitive Interviewing Reporting Framework (CIRF)". Methodology. 9 (3): 87–95. doi:10.1027/1614-2241/a000075. ISSN 1614-1881.
  30. ^ Sha, Mandy; Childs, Jennifer Hunter (2014-08-01). "Applying a project management approach to survey research projects that use qualitative methods". Survey Practice. 7 (4). doi:10.29115/SP-2014-0021.