Addressing challenges in assessing Human-Computer Interaction at scale
Human-Computer Interaction (HCI) is a research area which studies how people interact with computer systems. Because of its multidisciplinary nature, HCI modules often sit uneasily within the computer science curriculum, which is primarily composed of modules typically assessed through objective measures, using quantitative methods. Assessment criteria for HCI topics need to make some subjective measures quantifiable (e.g. aesthetics and creativity). In the case of large classes, it is critical that the assessment can scale appropriately without compromising the validity of the judgment of how well the learning outcomes have been achieved.

In the HCI module 'Interaction Design' at the University of Southampton, faced with increasing student numbers (from fewer than 80 to over 160 in two years), lecturers redesigned the assessment to provide timely feedback. The module is assessed by exam and coursework, where the exam includes a large section composed of multiple-choice questions (MCQs). In order to foster higher-order learning, students were encouraged to author MCQs using the platform PeerWise, which proved useful as a revision aid for the exam.

In the coursework, students are required to conduct qualitative research, which in turn informs the creation of prototypes for technical solutions to problems from diverse areas of interest. Providing students with such a diversity of choice encourages creativity and freedom, as well as the application of the theoretical background of human-computer interaction.

This presentation explains the authors' approach to assessment, both in supporting the creation of MCQs and exam revision, and in how the medium of video allowed for the expression of creativity and application of knowledge, whilst offering considerable ease of marking compared with traditional alternatives, which in turn enabled the provision of timely feedback to students.
Wilde, Adriana Gabriela
4f9174fe-482a-4114-8e81-79b835946224
Snow, Steve
475bccef-a436-476f-ab42-a3581be78de8
12 January 2018
Wilde, Adriana Gabriela and Snow, Steve
(2018)
Addressing challenges in assessing Human-Computer Interaction at scale.
Computing Education Practice Conference, Durham University, Durham, United Kingdom.
11 - 12 Jan 2018.
Record type:
Conference or Workshop Item
(Paper)
This record has no associated files available for download.
More information
Published date: 12 January 2018
Venue - Dates:
Computing Education Practice Conference, Durham University, Durham, United Kingdom, 2018-01-11 - 2018-01-12
Identifiers
Local EPrints ID: 436179
URI: http://eprints.soton.ac.uk/id/eprint/436179
PURE UUID: a287b2b8-7beb-42df-a84f-8f5e191fa5d6
Catalogue record
Date deposited: 03 Dec 2019 17:30
Last modified: 12 Nov 2024 02:46
Contributors
Author:
Adriana Gabriela Wilde
Author:
Steve Snow