Multiple-Choice Questions in Basic Biomedical Science Module
DOI: https://doi.org/10.23887/jpp.v57i1.63314
Keywords: Multiple Choice Questions, Item Difficulty Index, Item Discrimination Index
Abstract
Assessment in medical education evaluates knowledge, skills, and attitudes against the competencies that graduates are expected to achieve. The multiple-choice question (MCQ) is one of the assessment instruments most frequently used in the medical field, and MCQs are also used in Indonesia's national medical competency examination. Maintaining the quality of MCQs at the faculty level is therefore essential to maintaining the quality of medical graduates. In this study, 250 MCQ items from three basic biomedical science modules were evaluated, and the items' characteristics, item difficulty index (DIF-I), and item discrimination index (DI) were analyzed. The KR-20 reliability coefficient was above 0.8 in all three modules. The number of items with an ideal difficulty index (DIF-I) was 33 (36.7%), 29 (38.7%), and 34 (39.5%), respectively, and the proportion of items with an ideal discrimination index (DI) was 63.3%, 77.3%, and 69.4%, respectively. These results show that some MCQs are still not ideal and require attention in future revisions, and that further work is needed to raise the standard of MCQs used in medical examinations. Periodic evaluation and faculty training on writing standardized multiple-choice question components should be planned.
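The abstract refers to three classical test theory statistics: the item difficulty index (DIF-I, the proportion of examinees answering an item correctly), the item discrimination index (DI, commonly the difference in item difficulty between upper and lower scoring groups), and KR-20 reliability. The sketch below illustrates how these statistics can be computed for dichotomous MCQ scores; it is a minimal illustration under stated assumptions, not the authors' analysis pipeline, and the function name, the 27% group split, and the 0.30-0.70 "ideal" difficulty band are illustrative choices rather than values taken from the study.

```python
import numpy as np

def item_analysis(scores: np.ndarray):
    """Classical item analysis for dichotomous (0/1) MCQ scores.

    scores: 2-D array of shape (n_students, n_items), 1 = correct, 0 = incorrect.
    Returns per-item difficulty (DIF-I), per-item discrimination (DI),
    and the KR-20 reliability of the whole test.
    """
    n_students, n_items = scores.shape
    totals = scores.sum(axis=1)

    # Item difficulty index: proportion of examinees answering the item correctly.
    dif_i = scores.mean(axis=0)

    # Discrimination index: difference in item p-value between the upper and
    # lower 27% of examinees ranked by total test score (a common convention).
    k = max(1, int(round(0.27 * n_students)))
    order = np.argsort(totals)
    lower, upper = scores[order[:k]], scores[order[-k:]]
    di = upper.mean(axis=0) - lower.mean(axis=0)

    # KR-20 = (K / (K - 1)) * (1 - sum(p_i * q_i) / variance of total scores).
    p, q = dif_i, 1.0 - dif_i
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / totals.var(ddof=1))

    return dif_i, di, kr20


# Usage with simulated responses from 100 students on 40 items (for illustration only).
rng = np.random.default_rng(0)
ability = rng.normal(size=(100, 1))
simulated = (rng.random((100, 40)) < 1 / (1 + np.exp(-ability))).astype(int)
dif_i, di, kr20 = item_analysis(simulated)
ideal = ((dif_i >= 0.3) & (dif_i <= 0.7)).sum()  # illustrative "ideal" difficulty band
print(f"KR-20 = {kr20:.2f}; items in ideal difficulty range = {ideal}")
```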
License
Copyright (c) 2023 Made Bayu Permasutha, Gandes Retno Rahayu, Made Kurnia Widiastuti Giri, Dewa Agung Gde Fanji Pradiptha
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with Jurnal Pendidikan dan Pengajaran agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike License (CC BY-SA 4.0) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., posting it to an institutional repository or publishing it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their websites) prior to and during the submission process, as this can lead to productive exchanges as well as earlier and greater citation of the published work.