How should we measure and show online learning?
O'Riordan, Tim (2015) How should we measure and show online learning? Association for Learning Technology Annual Conference 2015: Shaping the Future of Learning Together, Manchester, United Kingdom, 08-10 Sep 2015.
Record type: Conference or Workshop Item (Other)
Abstract
The proliferation of Web-based objects designed for learning makes finding and evaluating online resources a considerable hurdle for learners to overcome (Eisenberg, 2008). While established Learning Analytics methods use Web interaction data as proxies for learner engagement (Siemens et al., 2011), it is uncertain whether these metrics can reveal how learning is being accomplished, or make it visible in a meaningful way (Duval, 2011). Data visualisation ‘dashboards’ can provide useful feedback that helps learners and educators evaluate learning resources (Verbert et al., 2014), which may in turn improve discovery of content better suited to their needs.
This presentation explores the authors’ recent research into meaningful measures of pedagogic activity in online discussions. Learner and instructor participation in a MOOC-based comment forum was evaluated using the DiAL-e pedagogical framework (Burden & Atkinson, 2008). Results from this study indicate that measuring pedagogic activity is distinct from typical interactional measures: correlations with established Learning Analytics metrics were negative or non-significant, while correlations with linguistic indicators of pedagogical activity were strong. This suggests that pedagogical and linguistic analysis, rather than interaction analysis, has the potential to automatically identify learning behaviour, and may be more useful in visualising pedagogic activity. Further, the validity of DiAL-e as a coding method has been reviewed and compared with established Computer Mediated Communication content analysis methods, including SOLO (Holmes, 2005) and the Community of Inquiry framework (CoI; Joksimovic et al., 2014), as well as with Learning Analytics and Language Analysis methods. Findings suggest that these different measures of cognitive activity (DiAL-e’s engagement with pedagogic activities, SOLO’s increasing complexity of understanding, and CoI’s development of reflective discourse) are closely correlated.
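As an illustration of the kind of comparison described above, the following minimal sketch (in Python, with entirely hypothetical data and variable names; it is not the authors’ actual pipeline) computes Spearman rank correlations between hand-coded pedagogic activity scores, a typical interaction proxy, and a simple linguistic indicator:

# A minimal sketch, assuming hypothetical per-comment measures;
# not the authors' pipeline. Values and variable names are invented.
from scipy.stats import spearmanr

dial_e = [3, 1, 4, 2, 5, 1, 4, 3]            # hand-coded pedagogic activity (DiAL-e)
replies = [9, 2, 1, 7, 0, 5, 2, 8]           # interaction proxy (replies received)
cog_words = [12, 3, 15, 6, 21, 2, 14, 10]    # linguistic indicator (cognitive-process word count)

for name, metric in [("replies", replies), ("cognitive words", cog_words)]:
    rho, p = spearmanr(dial_e, metric)       # rank correlation and p-value
    print(f"DiAL-e vs {name}: rho = {rho:.2f}, p = {p:.3f}")

On the pattern reported above, one would expect the interaction proxy to correlate weakly or negatively with the pedagogic coding, and the linguistic indicator to correlate strongly.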
Because engagement with e-learning differs from other Web activity, it is important to establish relevant measures of pedagogic engagement within short, informal comments, and to develop automated analysis and visualisation systems that are meaningful in this field and distinct from general news and social media measures.
In addition to the authors’ work, an overview of recent research in this area will be presented, including an activity and discussion to gauge audience attitudes towards relevant content analysis and visualisation methods.
Burden, K. & Atkinson, S., 2008. Beyond Content: Developing Transferable Learning Designs with Digital Video Archives. Proceedings of the World Conference on Educational Multimedia, Hypermedia and Telecommunications, pp.4041–4050.
Duval, E., 2011. Attention please! Learning analytics for visualization and recommendation. Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK ’11), pp.9–17. Available at: http://dl.acm.org/citation.cfm?id=2090118.
Eisenberg, M.B., 2008. Information literacy: Essential skills for the information age. DESIDOC Journal of Library & Information Technology, 28(2), pp.39–47. Available at: http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=ED491879.
Holmes, K., 2005. Analysis of asynchronous online discussion using the SOLO taxonomy. Australian Journal of Educational and Developmental Psychology, 5, pp.117–127.
Joksimovic, S. et al., 2014. Psychological characteristics in cognitive presence of communities of inquiry: A linguistic analysis of online discussions. The Internet and Higher Education, 22, pp.1–10. Available at: http://linkinghub.elsevier.com/retrieve/pii/S1096751614000189.
Siemens, G. et al., 2011. Open Learning Analytics: An integrated & modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques. Society for Learning Analytics Research (SoLAR).
Verbert, K. et al., 2014. Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18, pp.1499–1514.
Slideshow: ALTC090915_6.pptx (Other). Available under License Other.
Text: MeasuringAttentionLearningOnlineComments.pdf (Other). Available under License Other.
More information
Published date: 9 September 2015
Venue - Dates: Association for Learning Technology Annual Conference 2015: Shaping the Future of Learning Together, Manchester, United Kingdom, 2015-09-08 - 2015-09-10
Organisations: Web & Internet Science
Identifiers
Local EPrints ID: 383411
URI: http://eprints.soton.ac.uk/id/eprint/383411
PURE UUID: 1f4159b4-692d-4c9d-a993-8ee7faa041f6
Catalogue record
Date deposited: 05 Nov 2015 13:14
Last modified: 14 Mar 2024 21:42
Contributors
Author: Tim O'Riordan