Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270683
Title: Student modelling by adaptive testing : a knowledge-based approach
Author: Abdullah, Sophiana Chua
Awarding Body: University of Kent at Canterbury
Current Institution: University of Kent
Date of Award: 2003
Availability of Full Text:
Access through EThOS:
Full text unavailable from EThOS. Please use the institution access link instead.
Access through Institution:
Abstract:
An adaptive test is one in which the number of test items and the order in which the items are presented are computed during the delivery of the test, so as to obtain an accurate estimate of a student's knowledge with a minimum number of test items. This thesis is concerned with the design and development of computerised adaptive tests for use within educational settings. Just as intelligent tutoring systems are designed to emulate human tutors, adaptive testing systems can be designed to mimic effective informal examiners. The thesis focuses on the role of adaptive testing in student modelling, and demonstrates the practicality of constructing such tests using expert emulation. It makes the case that, for small-scale adaptive tests, a construction process based on the knowledge acquisition techniques of expert systems is practical and economical. Several experiments in knowledge acquisition for the construction of an adaptive test are described, in particular experiments to elicit information for the domain knowledge, the student model and the problem progression strategy. The thesis shows how a description of a particular problem domain may be captured using traditional techniques, supported by software development in the constraint logic extension to Prolog. It also discusses knowledge acquisition techniques for determining the sequence in which questions should be asked. A student modelling architecture called SKATE is presented. This incorporates an adaptive testing strategy called XP, which was elicited from a human expert. The XP strategy is evaluated using simulations of students. This approach to evaluation facilitates comparisons between testing approaches and is potentially useful in tuning adaptive tests.
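The adaptive-testing loop the abstract describes (select the next item from the current knowledge estimate, update the estimate from the answer, stop once the estimate is accurate enough or the item budget is spent) can be sketched as below. This is a minimal illustration only: the binary-search item-selection rule and the `simulated_student` oracle are assumptions for the sketch, not the XP strategy or the SKATE architecture from the thesis.

```python
import random

def simulated_student(ability, difficulty):
    """Hypothetical simulated student, in the spirit of the thesis's
    simulation-based evaluation: answers correctly when the item's
    difficulty does not exceed the hidden ability, with some noise."""
    p_correct = 0.9 if difficulty <= ability else 0.1
    return random.random() < p_correct

def adaptive_test(answer, difficulties, max_items=10):
    """Binary-search-style adaptive test over a sorted difficulty scale.

    Maintains an interval [lo, hi] of plausible ability levels: a correct
    answer raises the lower bound, an incorrect one lowers the upper bound.
    Stops when the interval collapses or the item budget is exhausted.
    """
    lo, hi = 0, len(difficulties) - 1
    asked = 0
    while lo < hi and asked < max_items:
        mid = (lo + hi + 1) // 2           # probe the middle of the interval
        if answer(difficulties[mid]):
            lo = mid                        # student handles this level
        else:
            hi = mid - 1                    # too hard: cap the estimate
        asked += 1
    return difficulties[lo], asked          # estimated level, items used

# Deterministic check: a student who masters levels up to 5 is located
# in a handful of items rather than all ten.
level, asked = adaptive_test(lambda d: d <= 5, list(range(10)))
```

With a noiseless oracle (`lambda d: d <= 5`) over ten difficulty levels, the loop pins down level 5 in a few questions, which is the "accurate estimate with a minimum number of items" property the abstract refers to.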
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.270683
DOI: Not available
Keywords: QA 76 Software, computer programming; Artificial intelligence; Education; Psychology