University of Southampton Institutional Repository

Addressing challenges in assessing Human-Computer Interaction at scale

Wilde, Adriana Gabriela
37ee0dec-a07f-4177-b291-96037fe48e14
Snow, Steve
475bccef-a436-476f-ab42-a3581be78de8

Wilde, Adriana Gabriela and Snow, Steve (2018) Addressing challenges in assessing Human-Computer Interaction at scale. Computing Education Practice Conference, Durham University, United Kingdom. 11 - 12 Jan 2018.

Record type: Conference or Workshop Item (Paper)

Abstract

Human-Computer Interaction (HCI) is a research area which studies how people interact with computer systems. Because of its multidisciplinary nature, HCI modules often sit uneasily within the computer science curriculum, which is primarily composed of modules typically assessed through objective measures using quantitative methods. Assessment criteria for HCI topics need to make some subjective measures quantifiable (e.g. aesthetics and creativity). In the case of large classes, it is critical that the assessment can scale appropriately without compromising the validity of the judgement of how well the learning outcomes have been achieved.

In the HCI module 'Interaction Design' at the University of Southampton, faced with increasing student numbers (from fewer than 80 to over 160 in two years), lecturers redesigned the assessment to provide timely feedback. The module is assessed by exam and coursework, where the exam includes a large section composed of multiple-choice questions (MCQs). In order to foster higher-order learning, students were encouraged to author MCQs using the platform PeerWise, which proved useful as a revision aid towards the exam.

In the coursework, students are required to conduct qualitative research, which in turn informs the creation of prototypes for technical solutions to problems from diverse areas of interest. Providing students with such a diversity of choices encourages creativity and freedom, as well as the application of the theoretical background of human-computer interaction.

This presentation explains the authors' approach to assessment, both in supporting the creation of MCQs and exam revision, and in how the medium of video allowed for the expression of creativity and application of knowledge, whilst being considerably easier to mark than traditional alternatives, which allowed for the provision of timely feedback to students.

Full text not available from this repository.

More information

Published date: 12 January 2018
Venue - Dates: Computing Education Practice Conference, Durham University, United Kingdom, 2018-01-11 - 2018-01-12

Identifiers

Local EPrints ID: 436179
URI: http://eprints.soton.ac.uk/id/eprint/436179
PURE UUID: a287b2b8-7beb-42df-a84f-8f5e191fa5d6
ORCID for Adriana Gabriela Wilde: orcid.org/0000-0002-1684-1539

Catalogue record

Date deposited: 03 Dec 2019 17:30
Last modified: 15 Aug 2020 01:43

Contributors

Author: Adriana Gabriela Wilde
Author: Steve Snow

