From Computer Adaptive Testing to Automated Scoring in a Senior Secondary School Physics Essay Test in Osun State, Nigeria


Many teachers find teaching enjoyable but regard scoring as a chore they would rather avoid, which suggests that many would prefer automated scoring to manual scoring. Computer-based testing supports objective item formats such as multiple-choice, drag-and-drop and fill-in-the-blank, as well as essay items. With a well-designed protocol, this form of assessment gives students instant feedback, allowing them to monitor their academic progress. Scoring objective tests, whether supply types (completion and fill-in) or select types (yes/no, matching and multiple-choice), poses little challenge, and many examination bodies can score them with little or no support from the vendors of the scoring software. By contrast, essay tests, which fall into restricted-response and extended-response categories, do not admit a "one cap fits all" approach: each question requires its own scoring software, and software designed for one examination cannot be reused for another, even in the same subject. In this study, a battery of physics essay tests was prepared with a text editor and uploaded to the internet; students answered the questions and submitted their responses online. Scoring software (built on the Moodle platform) extracted predetermined features from each student's response, and the predetermined marks attached to the extracted features determined the student's score. To validate the automated scoring, the responses were also marked manually with a prepared marking guide. The two sets of scores were correlated, and a high positive correlation was found between the automated and manual scores. We therefore advocate automated scoring.

Keywords: Computer Adaptive Testing, Automated Scoring, Moodle, Learning Management System, Physics Essay Test
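To make the feature-extraction idea concrete, the sketch below shows one minimal way such scoring could work: each question's marking scheme is a list of feature patterns with attached marks, and a response earns the mark for every feature it contains. This is an illustration under assumptions, not the paper's actual Moodle-based software; the function names, patterns and mark values are all hypothetical.

```python
# Minimal sketch of feature-based essay scoring (hypothetical, not the
# paper's software). A marking scheme is a list of (regex pattern, mark)
# pairs; the score is the sum of marks for features found in the response.
import re

def score_response(response: str, scheme: list[tuple[str, float]]) -> float:
    """Award the predetermined mark for each scheme feature found in the response."""
    total = 0.0
    for pattern, mark in scheme:
        if re.search(pattern, response, re.IGNORECASE):
            total += mark
    return total

# Hypothetical scheme for a question on Ohm's law.
ohms_law_scheme = [
    (r"\bV\s*=\s*I\s*\*?\s*R\b", 2.0),    # states the formula V = IR
    (r"directly proportional", 1.0),       # names the relationship
    (r"constant temperature", 1.0),        # states the condition
]

print(score_response("V = IR at constant temperature", ohms_law_scheme))  # 3.0
```

Because the scheme is just data, a new question needs a new scheme rather than new code, which is consistent with the observation that each question's features and marks must be predetermined separately.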
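The validation step reduces to computing a correlation coefficient between the automated and manual score lists. The sketch below uses Pearson's r on made-up scores purely for illustration; the paper does not specify which coefficient or values were used.

```python
# Correlating automated scores with manual marking-guide scores.
# The score lists here are illustrative data, not the study's results.
from statistics import correlation  # Pearson's r; Python 3.10+

automated = [12.0, 15.0, 9.0, 18.0, 11.0]
manual    = [11.0, 16.0, 9.0, 17.0, 12.0]

r = correlation(automated, manual)
print(f"Pearson r = {r:.2f}")  # a high positive r supports automated scoring
```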
