Psychometric Analysis of Senior Secondary School Certificate Examination (SSCE) 2017 NECO English Language Multiple Choice Test Items in Kwara State Using Item Response Theory

  • Kasali Jimoh
  • Oluwaseyi Aina Opesemowo Department of Educational Foundation, Faculty of Education, Ajayi Crowther University, Oyo
  • Yinusa Akintomide Faremi Department of Educational Foundations and Management University of Eswatini Kwaluseni Campus, Eswatini
Keywords: Carelessness, Dimensionality, Discrimination, Difficulty, Guessing, Item response theory

Abstract


The study determined the dimensionality of the 2017 National Examinations Council (NECO) English Language multiple-choice test items and estimated the item parameter indices (discrimination, difficulty, guessing, and carelessness) using the four-parameter logistic model (4-PLM). The study employed an ex-post facto design. The population comprised all candidates/test-takers who enrolled for and sat the June/July Senior School Certificate Examination (SSCE) 2017 NECO English Language examination in Kwara State, Nigeria, from which 12,000 respondents were purposively selected across the sixteen Local Government Areas of the State. The research instruments were the Optical Mark Record Sheets for the NECO June/July 2017 English Language objective items. The responses of the testees were scored dichotomously, and the data collected were calibrated using the four-parameter logistic model. The results showed that the 2017 English Language multiple-choice items administered to SSCE students in Kwara State did not violate the assumption of unidimensionality, which made the items reliable for assessing students' knowledge of English Language. The results also showed that, based on the rule of thumb, only two items fitted the 4-PLM, while the remaining items did not. It was recommended, among others, that NECO and other examination bodies should intensify efforts toward improving the standard of English Language test items using the 4-PLM, which is the new trend for estimating item parameter indices.
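The 4-PLM referenced in the abstract extends the three-parameter logistic model with an upper asymptote, so that the four indices estimated in the study each map to one parameter: discrimination (a), difficulty (b), guessing (the lower asymptote c), and carelessness (reflected in the upper asymptote d, where 1 − d is the slipping probability). A minimal sketch of the item response function follows; the parameter values shown are illustrative only and are not drawn from the study's calibration results.

```python
import math

def four_pl_probability(theta, a, b, c, d):
    """Probability of a correct response under the 4-parameter logistic model.

    theta: examinee ability
    a: discrimination; b: difficulty
    c: guessing parameter (lower asymptote)
    d: upper asymptote (1 - d reflects carelessness/slipping)
    """
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))

# Illustrative (hypothetical) item: a=1.2, b=0.5, c=0.2, d=0.95.
# At theta == b the logistic term equals 0.5, so p = c + (d - c) / 2 = 0.575.
p = four_pl_probability(theta=0.5, a=1.2, b=0.5, c=0.2, d=0.95)
```

Setting d = 1 recovers the familiar 3-PL model; the fit criterion ("rule of thumb") mentioned in the abstract concerns whether all four estimated parameters for an item fall within acceptable ranges.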

 



Published
2022-12-09
How to Cite
Jimoh, K., Aina Opesemowo, O., & Akintomide Faremi, Y. (2022). Psychometric Analysis of Senior Secondary School Certificate Examination (SSCE) 2017 NECO English Language Multiple Choice Test Items in Kwara State Using Item Response Theory. Journal of Applied Research and Multidisciplinary Studies, 3(2). https://doi.org/10.32350/jarms.32.01