Item Quality Analysis Using the Rasch Model to Measure Elementary School Students' Critical Thinking Skills in STEM Learning


  • Ghullam Hamdu Universitas Pendidikan Indonesia
  • F N Fuadi Elementary School Teachers Education, Universitas Pendidikan Indonesia, Tasikmalaya
  • A Yulianto Elementary School Teachers Education, Universitas Pendidikan Indonesia, Tasikmalaya
  • Y S Akhirani Elementary School Teachers Education, Universitas Pendidikan Indonesia, Tasikmalaya



Item quality, the Rasch model, critical thinking.


Critical thinking, one of the 21st-century competencies required of students, needs to be developed and analyzed using a qualified assessment instrument. A test is one kind of critical thinking assessment instrument, and its quality must be developed and analyzed to create meaningful learning. A total of 10 multiple-choice items were developed based on critical thinking indicators. The items were then administered to forty-two fourth-grade students in one of the elementary schools in Tasikmalaya, West Java, after they had completed STEM learning. Focus group discussions were conducted to construct and validate the instrument. The test results were analyzed using the Rasch model with the assistance of Winsteps software version 3.75. The results indicated that analysis using the Rasch model could explain the quality of the critical thinking items in terms of difficulty level and item fit, and could categorize students' abilities and their fit with the STEM learning conducted.
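For readers unfamiliar with the method, software such as Winsteps fits the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between a student's ability and an item's difficulty (both on the same logit scale). The following is a minimal illustrative sketch of that model in Python; it is not the study's analysis, and the ability/difficulty values are hypothetical:

```python
import math

def rasch_prob(theta: float, b: float) -> float:
    """Probability that a person with ability theta answers an item
    of difficulty b correctly under the dichotomous Rasch model:
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the chance of success is exactly 50%.
p_matched = rasch_prob(theta=1.0, b=1.0)   # 0.5

# For a fixed ability, harder items (larger b, in logits) yield
# lower success probabilities -- this ordering is what lets the
# model rank items by difficulty.
p_easy = rasch_prob(theta=0.0, b=-1.0)
p_hard = rasch_prob(theta=0.0, b=1.0)
assert p_easy > p_hard
```

Because persons and items share one logit scale, the same ordering logic lets the analysis place students' abilities and item difficulties on a common map, which is how the study categorizes both.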

