The University of Southampton
University of Southampton Institutional Repository

Choosing between wider participation and quality of interactions: a study of learner engagement within PeerWise


Wilde, Adriana Gabriela

Wilde, Adriana Gabriela (2020) Choosing between wider participation and quality of interactions: a study of learner engagement within PeerWise. The United Kingdom and Ireland Computing Education Research conference, virtual, United Kingdom. 03 - 04 Sep 2020.

Record type: Conference or Workshop Item (Poster)

Abstract

Research interest in learner engagement within peer-supported digital environments has increased in recent years. Yet the effect of lecturers actively promoting participation within such spaces on the quality of the interactions has been little explored. This poster presents one such study of learner engagement in two consecutive years of using PeerWise in a second-year module in Human-Computer Interaction (compulsory for Computer Science and Web Science programmes, optional for several others).

For the first cohort of 141 students, the use of PeerWise was mandatory, with marks awarded subject to a minimum level of participation by two given deadlines. Over 95% of the registered students complied and engaged with the software. In contrast, for the second cohort of 169 students, this was an optional activity, not rewarded with marks; as a consequence, only 62% of the students (107 of them) elected to take part. Collectively, the latter group created only 81 questions, compared with the 531 generated by the previous cohort, owing to a lower number of unique authors (22 vs 126) and a slightly lower average production effort over the whole semester (3 questions per author vs 4.2). Other metrics of cohort participation were also lower: far fewer comments (118, against 265 in the first cohort) and answers to questions (4993, against 8707) were produced.
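The scale of this drop can be seen by putting the abstract's raw counts side by side. A minimal sketch in Python (the dictionary layout and the `retention` helper are my own naming for illustration, not part of the study's analysis):

```python
# Cohort participation figures as reported in the abstract.
mandatory = {"questions": 531, "authors": 126, "comments": 265, "answers": 8707}
optional  = {"questions": 81,  "authors": 22,  "comments": 118, "answers": 4993}

def retention(metric: str) -> float:
    """Optional-cohort output as a percentage of the mandatory cohort's."""
    return 100 * optional[metric] / mandatory[metric]

for metric in ("questions", "authors", "comments", "answers"):
    print(f"{metric}: {retention(metric):.0f}% of the mandatory cohort's output")
```

Question authoring falls to roughly 15% of the previous year's volume, while answering holds up comparatively better, at over half.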

However, both the quality and difficulty of the student-authored questions, as judged by peers, were significantly higher in the group with optional participation in PeerWise: difficulty averaged 0.558 in the first cohort and 0.722 in the second, and quality averaged 2.522 and 3.037 respectively. Other metrics of engagement in this peer-supported environment were also comparatively higher, in particular those related to “conversations” sparked by questions. These metrics incorporate replies to comments added to questions, and the actors involved in the exchanges, as calculated using Wilde’s interactional model of peer-supported digital environments, which builds on Chua’s dialogic labelling of posts in her discussion analytics work on FutureLearn MOOCs.
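The flavour of such conversation metrics can be sketched with a toy example: for each question thread, count the replies it attracted and the distinct actors involved. This is a simplified illustration only, not Wilde's actual interactional model or Chua's dialogic labels; the data layout and field names are assumptions.

```python
from collections import defaultdict

# Each post is (question_id, author). The first post per question is the
# question itself; later posts are comments/replies in that thread.
# (Hypothetical data for illustration.)
posts = [
    (1, "alice"), (1, "bob"), (1, "carol"), (1, "alice"),  # a real exchange
    (2, "dave"),                                           # no replies at all
]

threads = defaultdict(list)
for qid, author in posts:
    threads[qid].append(author)

for qid, authors in threads.items():
    replies = len(authors) - 1   # everything after the opening post
    actors = len(set(authors))   # distinct participants in the exchange
    print(f"question {qid}: {replies} replies, {actors} actors")
```

Under a metric of this kind, a question that sparks back-and-forth between several participants scores far higher than many isolated unanswered questions, which is the distinction the abstract draws between the two cohorts.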

Further, and more interestingly, all of the 107 students who used PeerWise in the second year achieved exam marks of 50% or higher, of whom 99 obtained 60% or higher. A limitation of this study is that it does not control for self-selection bias: I cannot claim that the use of PeerWise caused students to succeed in the exam (it is possible that only “good” students chose to use the software). Nor do I claim that removing the mandatory requirement caused students to engage in deeper discussions. It is worth noting, however, that assessment design may affect student engagement even when the learning activities remain the same. This study suggests that awarding marks as an incentive for wider participation may disincentivise deeper connections. Conversely, removing this incentive may foster higher-quality content and interactions, yet this benefit may not be felt widely across the class: a difficult trade-off, worth further study and consideration within the research community and among educators promoting the use of PeerWise in their courses.

Video: UKICER2020-poster2 (25MB)

More information

Published date: 4 September 2020
Venue - Dates: The United Kingdom and Ireland Computing Education Research conference, virtual, United Kingdom, 2020-09-03 - 2020-09-04
Keywords: student engagement, Tools to aid computer education, interactive learning environments, PeerWise, Assessment design

Identifiers

Local EPrints ID: 443659
URI: http://eprints.soton.ac.uk/id/eprint/443659
PURE UUID: 17b2775a-d875-45da-8d0c-e5f3604f34c7
ORCID for Adriana Gabriela Wilde: orcid.org/0000-0002-1684-1539

Catalogue record

Date deposited: 07 Sep 2020 16:31
Last modified: 08 Sep 2020 01:42
