The Development of Horizontal Anchor Items Test Tool by Rasch Model for Physics National Examination using Macromedia Flash
This study aims to develop a Macromedia Flash-based test instrument using horizontal anchor items for the National Examination in high school physics. The study used a research and development (R&D) method combining qualitative and quantitative approaches and following the Dick and Carey development model. Validation by material experts showed that the developed National Examination instrument was of very good quality in terms of material (91.25%), and validation by media experts showed very good quality in terms of media (87.5%). In the first stage of the empirical test, the first test instrument showed quite good reliability and the second showed sufficient reliability; all items in both instruments were declared valid, and the person-fit analysis detected no respondents who were inconsistent, careless, lucky, or cheating. The second stage produced the same pattern: quite good reliability for the first instrument, sufficient reliability for the second, all items valid, and no misfitting respondents. In the third stage, the first instrument showed good reliability and the second showed sufficient reliability; again, all items were declared valid and no respondents were flagged by the person-fit analysis. Both test instruments were therefore declared feasible for use in the Computer-Based National Examination in Physics.
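The person-fit analysis referred to above flags respondents whose answer patterns are improbable under the Rasch model (e.g. missing easy items while passing hard ones, which can indicate carelessness or lucky guessing). As a minimal illustration only, not the authors' actual analysis, the sketch below computes a crude outfit mean-square statistic for each respondent on a toy dichotomous response matrix, using simple logit-based starting estimates for ability and difficulty; the data, clamping bounds, and the ~2.0 flagging threshold are illustrative assumptions.

```python
import math

# Toy response matrix: rows = persons, columns = items (1 = correct).
# Person 3 has a suspicious pattern: misses the easy items, hits the hardest.
responses = [
    [1, 1, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 0, 0, 0, 1],
]

def logit(p):
    p = min(max(p, 0.01), 0.99)  # clamp to avoid infinite logits
    return math.log(p / (1 - p))

n_persons, n_items = len(responses), len(responses[0])

# Crude starting estimates (not full Rasch estimation):
# item difficulty from proportion correct, person ability from raw score.
difficulty = [-logit(sum(r[i] for r in responses) / n_persons)
              for i in range(n_items)]
ability = [logit(sum(r) / n_items) for r in responses]

def outfit(person):
    """Outfit mean-square: mean squared standardized residual over items."""
    total = 0.0
    for i in range(n_items):
        # Rasch probability of a correct response.
        p = 1 / (1 + math.exp(-(ability[person] - difficulty[i])))
        var = p * (1 - p)
        total += (responses[person][i] - p) ** 2 / var
    return total / n_items

fits = [outfit(p) for p in range(n_persons)]
# A common rule of thumb flags outfit MNSQ well above ~2.0 as an
# unexpected (careless or lucky-guessing) response pattern.
```

On this toy data, the aberrant respondent receives a much larger outfit value than the consistent ones, which is the kind of signal the person-fit check in the study relies on.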
Copyright (c) 2020 Jurnal Penelitian & Pengembangan Pendidikan Fisika
This work is licensed under a Creative Commons Attribution 4.0 International License.