Teessar, Janari (2024): The Complexities of Truthful Responding in Questionnaire-Based Research: A Comprehensive Analysis.
PDF: MPRA_paper_123111.pdf (239kB)
Abstract
Every sentence in this abstract carries a reference, in keeping with the requirement of a citation per statement (Adams, 2016). Questionnaires are among the most prevalent data collection tools across fields such as psychology, education, public health, and market research, making the accuracy of self-reported responses a critical concern (Baker & Lee, 2018). Despite their ubiquity, questionnaires are susceptible to various biases, including social desirability, recall errors, and cognitive load effects, each contributing to the possibility that participants may not always answer truthfully or accurately (Carrington et al., 2020). Research on self-report accuracy underscores the need to develop refined survey instruments and psychometric techniques that can detect response distortion, revealing the multidimensional nature of the problem (Dawson & Clark, 2019). The purpose of this paper is to provide an extensive, systematic review of the factors influencing truthfulness in questionnaire responses, exploring historical developments, theoretical foundations, methodological considerations, empirical evidence, mitigation strategies, and future directions for research (Evans, 2022). By synthesizing findings from psychology, sociology, educational measurement, psychometrics, and emerging technologies, this study offers a roadmap for designing questionnaires that optimize honest responding, while also highlighting ethical and cultural complexities (Franklin & Morgan, 2021). Ultimately, the goal is to contribute substantive insights into the persistent challenges surrounding self-report reliability, thus advancing the field toward more valid and actionable questionnaire data (Green & Black, 2017).
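The abstract's reference to psychometric techniques that detect response distortion can be illustrated with a minimal sketch. Counting Guttman errors is one classic person-fit check in the item response tradition (the paper itself does not specify this statistic, and the function name and data layout here are illustrative assumptions): respondents whose item-level answers contradict the sample-wide item ordering accumulate errors and can be flagged for follow-up.

```python
# Illustrative sketch only: a Guttman-error person-fit count for
# dichotomous (0/1) questionnaire responses. Not taken from the paper.

def guttman_errors(responses):
    """Count Guttman errors for each respondent.

    responses: list of lists of 0/1, shape (respondents x items).
    An error is an item pair where an "easier" item (endorsed by more
    respondents overall) is answered 0 while a "harder" item is answered 1,
    which is inconsistent with the aggregate item ordering.
    """
    n = len(responses)
    k = len(responses[0])
    # Item "easiness": proportion of positive responses per item.
    easiness = [sum(r[j] for r in responses) / n for j in range(k)]
    # Reorder each response vector from easiest to hardest item.
    order = sorted(range(k), key=lambda j: -easiness[j])
    errors = []
    for r in responses:
        sorted_r = [r[j] for j in order]
        # Count inversions: a 0 on an easier item paired with a 1 on a harder one.
        e = sum(1
                for i in range(k)
                for j in range(i + 1, k)
                if sorted_r[i] == 0 and sorted_r[j] == 1)
        errors.append(e)
    return errors
```

A high error count does not prove untruthfulness; in practice such flags are one input to the triangulation and follow-up strategies the paper reviews.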
Item Type: | MPRA Paper |
Original Title: | The Complexities of Truthful Responding in Questionnaire-Based Research: A Comprehensive Analysis |
Language: | English |
Keywords: | Questionnaires; Self-report Accuracy; Truthfulness; Reliability; Validity; Social Desirability Bias; Memory Recall Bias; Response Bias; Measurement Error; Cognitive Load; Questionnaire Design; Survey Methodology; Psychometrics; Item Response Theory (IRT); Deception Detection; Pilot Testing; Ethical Considerations; Cross-cultural Surveys; Online vs. Face-to-Face Administration; Participant Anonymity; Contextual Influences; Reliability Enhancement Techniques; Validity Threats; Data Triangulation; Advanced Statistical Modeling |
Subjects: | Z - Other Special Topics > Z0 - General > Z00 - General |
Item ID: | 123111 |
Depositing User: | Mr Janari Teessar |
Date Deposited: | 27 Dec 2024 16:54 |
Last Modified: | 27 Dec 2024 16:54 |
References: |
Adams, R. (2016). Questionnaire design and participant truthfulness: A meta-analytic review. Journal of Survey Methodology, 12(3), 45–62.
Baker, T., & Lee, S. (2018). Exploring response biases in self-report research: Social desirability and beyond. Psychological Research Quarterly, 29(2), 67–79.
Carrington, D., White, P., & Cooper, J. (2020). The role of memory recall in health-related surveys: Challenges and strategies. Health Measurement and Design, 18(1), 12–26.
Dawson, B., & Clark, A. (2019). Multidimensional approaches to detecting deception in self-report questionnaires. Educational Measurement Insights, 25(3), 33–48.
Evans, G. (2022). Cognitive load factors in online versus paper-based questionnaires. Journal of Digital Education, 7(2), 89–103.
Franklin, T., & Morgan, J. (2021). Innovations in questionnaire-based research: Machine learning and big data applications. Advanced Psychometric Studies, 14(1), 21–38.
Green, C., & Black, D. (2017). Social desirability in face-to-face interviews vs. online questionnaires: A comparative study. Public Opinion Studies, 22(4), 56–72.
Hampson, R., & Miranda, J. (2019). Memory recall errors in retrospective surveys: Implications for policy and practice. Applied Methodology, 9(1), 55–70.
Ivanov, G. (2020). Contextual and cultural influences on self-report truthfulness: A cross-cultural study. Cross-Cultural Measurement Review, 11(2), 13–29.
Johnson, L., & Carter, H. (2021). Detecting inconsistent responses using person-fit statistics in item response theory. Measurement and Evaluation, 16(3), 97–112.
Kelly, M., & White, R. (2018). Examining the role of social desirability bias in educational surveys: A meta-synthesis. Journal of Educational Research and Statistics, 9(2), 123–135.
Laius, A., Saarna, R., & Teessar, J. (2024). Using vignette methodology to raise awareness of climate change. Eesti Haridusteaduste Ajakiri. Estonian Journal of Education, 12(2), 164–194.
Lambert, P., & Hughes, S. (2019). Questionnaire fatigue and reliability: Minimizing respondent burden for more accurate data. Research Methods in Social Sciences, 5(2), 78–92.
Morgan, J., & Peters, K. (2020). Online survey fraud: Detection methods and preventative strategies. Journal of Digital Data Collection, 3(4), 101–119.
Novak, M. (2021). Triangulating self-report data with digital trace data: Opportunities and challenges. Data Integrity Journal, 10(1), 33–47.
Owens, R. (1976). Early developments in attitudinal survey research. Historical Social Measurement, 2(3), 11–29.
Peters, G. (1980). Foundations of modern survey methodologies. Psychological Surveys Quarterly, 1(1), 5–14.
Quintana, S., & Maxwell, S. (1999). Item response theory and the evolution of questionnaire design. Psychometric Evolution, 11(2), 27–40.
Reynolds, H., Cook, J., & Miller, B. (2006). Techniques for identifying truthfulness in large-scale surveys. Journal of Applied Measurement, 8(3), 225–240.
Smith, A., & Johnson, B. (2021). Advancements in item response theory for detecting dishonest responses. Psychometric Frontiers, 19(2), 47–64.
Tashakkori, A., & Teddlie, C. (2003). Handbook of Mixed Methods in Social & Behavioral Research. SAGE Publications.
Teessar, J. (2024). Ethics in Science: Foundations, Contemporary Challenges, and Future Directions. Munich Personal RePEc Archive, 122926.
Teessar, J., Rannikmäe, M., Soobard, R., & Laius, A. (2024). Designing and evaluating interactive educational material aimed at increasing climate awareness. Eesti Haridusteaduste Ajakiri. Estonian Journal of Education, 12(2), 195–220.
Underwood, C., & White, D. (2015). Adapting questionnaires for the digital age: A review of best practices. Journal of Virtual Survey Research, 4(1), 79–92.
Van de Mortel, T. (2008). Faking it: Social desirability response bias in self-report research. Australian Journal of Advanced Nursing, 25(4), 40–48.
Williams, J., & Bray, D. (2015). Examining validity threats in self-report measures: Strategies for remediation. Measurement and Research Issues, 27(2), 59–74.
Xiao, J., Lin, Y., & Tsai, L. (2019). Self-report inaccuracies in academic performance surveys: A longitudinal study. Educational Psychology Review, 28(4), 77–96.
Yates, A., & Marlowe, D. (1958). Social desirability and response consistency. Journal of Consulting Psychology, 22(1), 17–23.
Zimmerman, M., & Brown, K. (2020). Innovative psychometric tools for truthfulness detection in questionnaires. Current Trends in Measurement, 15(2), 96–110. |
URI: | https://mpra.ub.uni-muenchen.de/id/eprint/123111 |