JISC Report on E-Assessment Quality (REAQ) in UK Higher Education
Gilbert, Lester, Gale, Veronica, Wills, Gary and Warburton, Bill (2009) JISC Report on E-Assessment Quality (REAQ) in UK Higher Education.
PDF (JISC Report on Summative E-Assessment Quality in UK Higher Education)
- Published Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Commissioned by the Joint Information Systems Committee (JISC) in 2008, the ‘Report on Summative e-Assessment Quality (REAQ)’ project surveyed quality assurance (QA) activities commonly undertaken in summative e-assessment by UK Higher Education (HE) practitioners and others. The project focused on what denotes high quality in summative e-assessment for the interviewees, and on the steps they take to meet their own standards. An expert panel guided the project.

What denotes high quality summative e-assessment

Expert opinion focused, in this order of priority, on:
• Psychometrics (reliability, validity);
• Pedagogy (mapping to intended learning outcomes); and
• Practical issues (security, accessibility).

What ‘high quality’ meant to our interviewees depended on the role they played in the process of creating and using e-assessments. They listed the following matters, in descending order of how often they were mentioned:
• Using the medium to give an extra dimension to assessment, including creating e-assessments that are authentic to the skills being tested;
• Issues around delivery, including security, infrastructure reliability, and accessibility;
• Fairness and ease of use;
• Supporting academic, managerial, and organisational goals;
• Addressing the intended learning outcomes; and
• Validity and reliability, mainly in their ‘non-psychometric’ senses. Interviewees with the role of learning technologist (or similar roles designed to aid academics in the use of e-assessment) used these terms in their psychometric senses.

Interviewees focused on the e-assessment issues that were foremost in their minds. As processes to deliver e-assessment are rarely embedded in institutions at present, interviewees described spending time and effort on practical issues to ensure that e-assessments would work effectively. Many of the quality characteristics identified by the interviewees as important in summative e-assessment are measured by psychometrics.
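As an illustration of the kind of psychometric measure the report has in mind, the sketch below computes Cronbach's alpha, a standard internal-consistency reliability coefficient for a set of test items. This is a minimal, illustrative example only; the response data and function names are invented for the sketch and are not drawn from the report.

```python
# Illustrative sketch: Cronbach's alpha for a set of dichotomously
# scored (0/1) e-assessment items. The response matrix is made up.

def cronbach_alpha(scores):
    """scores: one row per candidate, one column (0 or 1) per item."""
    n_items = len(scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_var = variance([sum(row) for row in scores])
    # alpha = k/(k-1) * (1 - sum of item variances / total-score variance)
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Five candidates, four items (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 0],
]
print(round(cronbach_alpha(responses), 3))  # prints 0.727
```

A low alpha on a live test would prompt exactly the kind of review of weak or inconsistent items that the interviewees describe below.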
Although some academics use these measures, the report suggests that more could benefit from using psychometric evaluation.

Steps needed to produce high quality e-assessment

Expert opinion focused on:
• Establishing sets of steps to follow for both content and quality management;
• Identifying, using, and developing relevant standards for both content and quality management;
• Identifying metrics for both content and process; and
• Capability maturity modelling as an encapsulation of these three essential elements of a quality management process.

Interviewee comments fell under a variety of rules of thumb or suggestions for useful steps, such as: noting that the effort needed to write e-assessments and their marking schemes, and to construct feedback, is front-loaded; starting with easier questions and making later questions more difficult; checking assessments with subject matter experts and high performers; identifying ‘weak’ questions and improving or eliminating them; reviewing question content to ensure syllabus coverage; getting help for academics, who usually have very limited knowledge of psychometrics; attending to security; and using accessibility guidelines. In summary:
• Heuristic steps for both content and quality management, and
• Accessibility standards.

Many interviewees assumed that e-assessments were:
• Valid if they were created by the academics responsible for the course, and
• Subject to the same quality assurance processes as traditional assessments, as well as those required specifically for e-assessment.
The report questions these assumptions.

Recommendations

The report makes a number of recommendations to support academics creating high quality summative e-assessments, including:
• A toolkit for the end-to-end process of creating e-assessment should be developed.
• A practical guide to the steps involved in creating and maintaining an e-assessment system.
• Guidelines for the quality assurance of e-assessments.
• Psychometric measures for assessing the quality of item banks rather than individual questions, for assessing, tracking, and reporting the quality of banked items during their lifecycle of use.
• Development and extension of existing psychometric theory to include multi-staged and optional stepped constructed response questions.
• Workshops and support materials to disseminate good practice in the use of psychometrics for selected response items and for questions employing constructed responses.
• Workshops and support materials to disseminate good practice in question creation and meeting educational needs beyond simple selected response items, possibly subject based.
• Accessibility and user interface guidelines for deploying e-assessment, in particular addressing the use of browsers.
• Guidelines for the use and role of MathML for expression recognition in e-assessments.
• A repository of exemplars of good practice for both selected response and constructed response questions.
• JISC and other community calls for, and sponsorship of, e-assessment bids should consider where and how bidders should incorporate appropriate psychometric measures in their proposals.
• Commercial vendors should improve the accessibility of their psychometric reports to all stakeholders, possibly simplifying them to encourage take-up of their contents.
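The classical item analysis behind "identifying ‘weak’ questions" and tracking the quality of banked items can be sketched as below. Facility (proportion correct) and corrected item-total discrimination are standard classical-test-theory statistics; the response data and the flagging thresholds here are illustrative assumptions, not figures from the report.

```python
# Illustrative sketch: classical item analysis for flagging weak
# questions. Facility = proportion of candidates answering correctly;
# discrimination = correlation of an item with the rest-of-test score.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def item_analysis(scores):
    """scores: one row per candidate, one column (0 or 1) per item.
    Returns a (facility, discrimination) pair for each item."""
    n_cand = len(scores)
    totals = [sum(row) for row in scores]
    report = []
    for i in range(len(scores[0])):
        item = [row[i] for row in scores]
        facility = sum(item) / n_cand
        # Corrected item-total correlation: item vs. rest-of-test score
        rest = [totals[j] - item[j] for j in range(n_cand)]
        report.append((facility, pearson(item, rest)))
    return report

responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 0],
]
for i, (fac, disc) in enumerate(item_analysis(responses), 1):
    # Illustrative thresholds: too easy, too hard, or poorly discriminating
    flag = "review" if fac > 0.9 or fac < 0.2 or disc < 0.2 else "ok"
    print(f"Q{i}: facility={fac:.2f} discrimination={disc:.2f} {flag}")
```

Run over a whole item bank, and tracked across each item's lifecycle of use, statistics like these support the banked-item quality reporting the recommendations call for.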
|Item Type:||Monograph (Technical Report)|
|Divisions:||Faculty of Physical and Applied Science > Electronics and Computer Science > Electronic & Software Systems|
|Date Deposited:||24 Jul 2009 07:12|
|Last Modified:||02 Mar 2012 14:04|
|Contributors:||Gilbert, Lester (Author)
Gale, Veronica (Author)
Wills, Gary (Author)
Warburton, Bill (Author)|
|Contact Email Address:||firstname.lastname@example.org|