Examining the Psychometric Properties of the Career Commitment Instrument through Classical Test Theory and the Graded Response Model
DOI: https://doi.org/10.23887/jere.v7i3.59619

Keywords: Psychometric Properties, Career Commitment, Classical Test Theory, Graded Response Model

Abstract
This study addresses gaps in the literature by conducting a comprehensive evaluation of the psychometric properties of the Career Commitment Instrument using Classical Test Theory (CTT) and the Graded Response Model (GRM). This exploratory study aims to analyze the psychometric properties of the career commitment instrument using a classical test theory approach and the graded response model. The research used a cross-sectional design. Data were obtained from a career commitment questionnaire with 12 statement items administered to 250 randomly selected respondents. The summated rating (Likert) method was used for scaling, with five response options. Data were analyzed in RStudio using classical test theory and graded response model techniques. Based on the findings, the career commitment instrument has an estimated reliability of 0.77 (reliable) and a standard error of measurement of 3.3. The instrument shows good endorsement and discrimination indices under both the classical and modern approaches. Furthermore, the graded response model analysis revealed that 10 items fit the model and 2 did not. Administered to respondents ranging from low ability (θ = -2) to high ability (θ = 2), the instrument provides total information of 58.93 with a standard error of 1.0. Companies can use this instrument to assess the career commitment of their employees. Future research can test the convergent and divergent validity of the career commitment instrument against similar or different instruments to strengthen its validity.
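The reliability and standard error of measurement reported above are related by the standard CTT identity SEM = SD × √(1 − reliability). The sketch below is not the authors' actual analysis (which was carried out in RStudio); it is a minimal Python illustration of that calculation on simulated data whose dimensions (250 respondents, 12 five-point Likert items) mirror the study's design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 250 respondents x 12 Likert items (1-5), all driven by a
# single latent trait. The responses themselves are synthetic, not the study's.
n_persons, n_items = 250, 12
theta = rng.normal(size=(n_persons, 1))                    # latent trait
raw = theta + rng.normal(scale=1.0, size=(n_persons, n_items))
items = np.clip(np.round(raw * 1.2 + 3), 1, 5)             # discretize to 1..5

def cronbach_alpha(x):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

alpha = cronbach_alpha(items)
# Standard error of measurement: SEM = SD_total * sqrt(1 - reliability)
sem = items.sum(axis=1).std(ddof=1) * np.sqrt(1 - alpha)
print(f"alpha = {alpha:.2f}, SEM = {sem:.2f}")
```

With the study's reported values, the same identity implies a total-score standard deviation of roughly 3.3 / √(1 − 0.77) ≈ 6.9.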
License
Copyright (c) 2023 Muh. Asriadi AM, Farida Agus Setiawati
This article is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with the Journal of Evaluation and Research in Education (JERE) agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication with the work simultaneously licensed under a Creative Commons Attribution License (CC BY-SA 4.0) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work. (See The Effect of Open Access)