Title: Comparability of science assessment across languages : the case of PISA science 2006
Author: El Masri, Yasmine Hachem
ISNI: 0000 0004 6061 8320
Awarding Body: University of Oxford
Current Institution: University of Oxford
Date of Award: 2015
Availability of Full Text: Full text unavailable from EThOS; access via the awarding institution.
In this research, I investigated the extent to which language versions (English, French and Arabic) of the same science test were comparable in terms of item difficulty and demands, using PISA science 2006 data from three countries (the UK, France and Jordan respectively). I argued that language is an intrinsic part of the scientific literacy construct, whether or not the examiner intends it to be: the tight relationship between the language element and the scientific knowledge makes the language variable inextricable from the construct. This argument has considerable implications for the methodologies used to address the question, and I argued that none of the available statistical or qualitative techniques was capable of teasing out the language variable and answering the research question. In this thesis, I combined critical evaluation with empirical methods, drawing on literature from various fields (cognitive linguistics, psychology, measurement and science education) to analyse the test development and design procedures, and I illustrated my claims with evidence from the technical reports and examples of released items. To address my question empirically, I adopted the same class of models employed in PISA, the Rasch model, together with differential item functioning (DIF) techniques. General tests of fit suggested an overall good fit of the data to the model, with eleven items out of 103 showing strong evidence of misfit; various violations of the requirements of the Rasch model were also highlighted. The DIF analysis indicated that 22% of the items showed bias in the selected countries, although the bias balanced out at test level. Limitations of the DIF analysis in identifying the source of bias were discussed. Qualitative approaches to investigating question demands were examined, and issues with their usefulness in international settings were discussed. A way forward incorporating cognitive load theory and computational linguistics is proposed.
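The abstract's reference to the Rasch model and to DIF can be illustrated with a minimal sketch. The response-probability formula below is the standard dichotomous Rasch model; the difficulty estimates used in the DIF comparison are purely hypothetical values invented for illustration, not figures from the thesis.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model:
    P(X=1) = exp(ability - difficulty) / (1 + exp(ability - difficulty))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A respondent whose ability equals the item's difficulty answers
# correctly with probability 0.5, regardless of which item it is.
print(rasch_probability(0.0, 0.0))  # 0.5

# Illustrative DIF check (hypothetical estimates): the same item is
# calibrated separately in two language groups; a large gap between the
# group-specific difficulty estimates flags potential item bias.
difficulty_english = -0.3  # hypothetical estimate, English version
difficulty_arabic = 0.4    # hypothetical estimate, Arabic version
dif_size = difficulty_arabic - difficulty_english
print(f"DIF size (logits): {dif_size:.1f}")
```

In uniform-DIF terms, a gap of this kind means the item is harder for one language group than the other at every ability level, which is exactly the item-level bias the abstract reports for 22% of the items.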
Supervisor: Baird, Jo-Anne ; McNichol, Jane
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available
Keywords: Science--Study and teaching--Evaluation ; Educational evaluation--Case studies ; Comparative education