Quality of multiple-choice items used in a Progress Test

Authors

DOI:

https://doi.org/10.18222/eae.v33.7533

Keywords:

Higher Education, Learning Assessment, Multiple Choice Test

Abstract

This article analyzes and discusses the quality of multiple-choice items written for a Progress Test, through a qualitative and quantitative analysis of 100 items. Classified under the Revised Bloom's Taxonomy, the items showed a predominance of the conceptual knowledge dimension and of the cognitive process summarized by the verb “understand”, and 44% of the items complied with the technical guidelines for writing multiple-choice items. Under Classical Test Theory, the KR-20 was 0.835, the mean difficulty (P) was 0.496, and the mean discrimination index was 0.240. The test presented good reliability and suitable item difficulty. Special attention should be given to the discrimination index and to the wording of the questions in order to improve these parameters across the items.
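The statistics reported above follow standard Classical Test Theory definitions. As a minimal illustrative sketch only (the study itself used the ITEMAN software, not this code), assuming a dichotomously scored 0/1 response matrix with one row per examinee and one column per item:

```python
import numpy as np

def item_stats(responses):
    """Compute KR-20 reliability, mean item difficulty (P), and mean
    discrimination index for a 0/1 response matrix
    (rows = examinees, columns = items)."""
    n, k = responses.shape
    totals = responses.sum(axis=1)

    # Item difficulty: proportion of examinees answering each item correctly.
    p = responses.mean(axis=0)
    q = 1.0 - p

    # KR-20: (k / (k-1)) * (1 - sum(p*q) / variance of total scores).
    var_total = totals.var()
    kr20 = (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)

    # Discrimination index: difficulty in the upper group minus the lower
    # group, using the conventional top and bottom 27% by total score.
    g = max(1, int(round(0.27 * n)))
    order = np.argsort(totals)
    lower, upper = responses[order[:g]], responses[order[-g:]]
    d = upper.mean(axis=0) - lower.mean(axis=0)

    return kr20, p.mean(), d.mean()
```

The 27% grouping is one common convention for the discrimination index; other analyses use the item-total point-biserial correlation instead.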


References

ABDULGHANI, Hamza Mohammad et al. Effectiveness of longitudinal faculty development programs on MCQs items writing skills: A follow-up study. PLoS One, v. 12, e0185895, Oct. 2017. Disponível em: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0185895. Acesso em: 8 jan. 2018.

ALI, Syed Haris; RUIT, Kenneth G. The impact of item flaws, testing at low cognitive level, and low distractor functioning on multiple-choice question quality. Perspectives on Medical Education, v. 4, n. 5, p. 244-251, Sept./Oct. 2015. Disponível em: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4602009/?report=reader. Acesso em: 23 jan. 2022.

BICUDO, Angélica Maria et al. Teste de Progresso em Consórcios para Todas as Escolas Médicas do Brasil. Revista Brasileira de Educação Médica, Brasília, v. 43, n. 4, p. 151-156, out./dez. 2019.

BOLLELA, Valdes Roberto et al. Avaliação somativa de habilidades cognitivas: experiência envolvendo boas práticas para a elaboração de testes de múltipla escolha e a composição de exames. Revista Brasileira de Educação Médica, Brasília, v. 42, n. 4, p. 74-85, out./dez. 2018.

BORGATTO, Adriano Ferreti; ANDRADE, Dalton Francisco de. Análise clássica de testes com diferentes graus de dificuldade. Estudos em Avaliação Educacional, São Paulo, v. 23, n. 52, p. 146-156, maio/ago., 2012.

FERRAZ, Ana Paula do Carmo Marcheti; BELHOT, Renato Vairo. Taxonomia de Bloom: revisão teórica e apresentação das adequações do instrumento para definição de objetivos instrucionais. Gestão & Produção, São Carlos, SP, v. 17, n. 2, p. 421-431, jun. 2010.

HALADYNA, Thomas M.; DOWNING, Steven M.; RODRIGUEZ, Michael C. A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, Londres, v. 15, n. 3, p. 309-334, jul. 2002.

HAMAMOTO FILHO, Pedro Tadao; BICUDO, Angélica Maria. Improvement of faculty’s skills on the creation of items for Progress Testing through feedback to item writers: a successful experience. Revista Brasileira de Educação Médica, Brasília, v. 44, n. 1, e018, 2020.

HINGORJO, Mozaffer Rahim; JALEEL, Farhan. Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. Journal of Pakistan Medical Association, Karachi, v. 62, n. 2, p. 142-147, Feb. 2012.

ITEMAN: software for classical analysis. Assessment Systems Corporation, © 2013. Relatório da Prova de Progresso Estética, abr. 2019.

JAVAEED, Arslaan. Assessment of higher ordered thinking in medical education: multiple choice questions and modified essay questions. MedEdPublish, United Kingdom, v. 7, n. 128, p. 11-18, June 2018. Disponível em: https://mededpublish.org/articles/7-128. Acesso em: 23 jan. 2022.

KHAN, Humaira Fayyaz et al. Identification of technical item flaws leads to improvement of the quality of single best Multiple-Choice Questions. Pakistan Journal of Medical Sciences, Pakistan, v. 29, n. 3, p. 715-718, May/June 2013.

KRATHWOHL, David R. A Revision of Bloom’s Taxonomy: an overview. Theory into Practice, London, v. 41, n. 4, p. 212-218, 2002.

LINNETTE D’SA, Juliana; VISBAL-DIONALDO, Maria Liza. Analysis of multiple-choice questions: item difficulty, discrimination index and distractor efficiency. International Journal of Nursing Education, India, v. 9, n. 3, p. 109-114, July 2017.

NEDEAU-CAYO, Rosemarie et al. Assessment of item-writing flaws in multiple-choice questions. Journal for Nurses in Professional Development, USA, v. 29, n. 2, p. 52-57, Mar./Apr. 2013.

PANIAGUA, Miguel A. et al. Construindo as perguntas do teste escrito para ciências básicas e clínicas. Philadelphia: NBME, 2016.

RUSH, Bonnie R. et al. The impact of item-writing flaws and item complexity on examination item difficulty and discrimination value. BMC Medical Education, United Kingdom, v. 16, n. 250, p. 1-10, Sept. 2016.

SARTES, Laisa Marcorela Andreoli; SOUZA-FORMIGONI, Maria Lucia Oliveira de. Avanços na psicometria: da Teoria Clássica dos Testes à Teoria de Resposta ao Item. Psicologia: Reflexão e Crítica, Porto Alegre, v. 26, n. 2, p. 241-250, jul. 2013.

SCHUWIRTH, Lambert W. T.; VAN DER VLEUTEN, Cees P. M. General overview of the theories used in assessment: AMEE Guide No. 57. Medical Teacher, United Kingdom, v. 33, p. 783-797, Sept. 2011.

SCULLY, Darina. Constructing multiple-choice items to measure higher-order thinking. Practical Assessment Research & Evaluation, v. 22, n. 4, p. 1-13, May 2017. Disponível em: http://pareonline.net/getvn.asp?v=22&n=4. Acesso em: 24 jan. 2022.

SILVA, Vailton Afonso da; MARTINS, Maria Inês. Análise de questões de física do ENEM pela taxonomia de Bloom revisada. Ensaio: Pesquisa em Educação e Ciências, Belo Horizonte, v. 16, n. 3, p. 189-202, set./dez. 2014.

TARIQ, Saba et al. Evaluation of cognitive levels and item writing flaws in medical pharmacology internal assessment examinations. Pakistan Journal of Medical Sciences, Pakistan, v. 33, n. 4, p. 866-870, July/Aug. 2017.

TARRANT, Marie; WARE, James. Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, Oxford, v. 42, n. 2, p. 198-206, Feb. 2008.

TARRANT, Marie et al. The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education Today, USA, v. 26, n. 8, p. 662-671, Dec. 2006.

TAVAKOL, Mohsen; DENNICK, Reg. Post-examination analysis of objective tests. Medical Teacher, United Kingdom, v. 33, n. 6, p. 447-458, May 2011.

TAVAKOL, Mohsen; DENNICK, Reg. Post-examination interpretation of objective test data: monitoring and improving the quality of high–stakes examinations – a commentary on two AMEE Guides. Medical Teacher, United Kingdom, v. 34, n. 3, p. 245-248, Feb. 2012.

TOMBI, Elen Cristina Nascimento de Araújo. Estudo dos itens de múltipla escolha de um teste de progresso aplicado em um curso de Estética de uma universidade paulistana. 2019. 89f. Dissertação (Mestrado Profissional) – Programa de Pós-Graduação em Ciências da Saúde, Universidade Federal de São Paulo, São Paulo, 2019.

VANDERBILT, Allison A.; FELDMAN, Moshe; WOOD, Isaac K. Assessment in undergraduate medical education: a review of course exams. Medical Education Online, v. 18, n.1, p. 1-5, Mar. 2013.

ZUKOWSKY-TAVARES, Cristina. Formação em avaliação como um caminho para a profissionalização docente. Revista Lusófona de Educação, Lisboa, v. 16, n. 16, p. 59-74, ago. 2010.

ZUKOWSKY-TAVARES, Cristina; LIMEIRA, Polyana de Castro; RUIZ-MORENO, Lídia. O portfólio e a construção de saberes docentes na pós-graduação em saúde. Pro-Posições, Campinas, SP, v. 30, e20170181, 2019.

Published

2022-02-10

How to Cite

Tombi, E. C. N. de A., Zukowsky-Tavares, C., & Ferreira-Gerab, I. (2022). Quality of multiple-choice items used in a Progress Test. Estudos Em Avaliação Educacional, 33, e07533. https://doi.org/10.18222/eae.v33.7533

Issue

Section

Articles