Please use this identifier to cite or link to this item: https://repository.isls.org//handle/1/888
Title: Toward Using Multi-Modal Learning Analytics to Support and Measure Collaboration in Co-Located Dyads
Authors: Starr, Emma L
Reilly, Joseph M
Schneider, Bertrand
Issue Date: Jul-2018
Publisher: International Society of the Learning Sciences, Inc. [ISLS].
Citation: Starr, E. L., Reilly, J. M., & Schneider, B. (2018). Toward Using Multi-Modal Learning Analytics to Support and Measure Collaboration in Co-Located Dyads. In Kay, J. and Luckin, R. (Eds.) Rethinking Learning in the Digital Age: Making the Learning Sciences Count, 13th International Conference of the Learning Sciences (ICLS) 2018, Volume 1. London, UK: International Society of the Learning Sciences.
Abstract: This paper describes an empirical study in which the productive interactions of small collaborative learning groups in response to two collaboration interventions were evaluated through traditional and multi-modal data collection methods. We asked 42 pairs (N=84) of participants to program a robot to solve a series of mazes. Participants had no prior programming experience, and we used a block-based environment with pre-made functions as well as video tutorials to scaffold the activity. We explored two interventions to support their collaboration: a real-time visualization of their verbal contributions and a short verbal explanation of the benefits of collaboration for learning. This paper describes our experimental design, the effect of the interventions, preliminary results from the Kinect sensor, and our future plans to analyze additional sensor data. We conclude by highlighting the importance of capturing and supporting 21st century skills (i.e., collaboration and effective communication) in small groups of students.
URI: https://dx.doi.org/10.22318/cscl2018.448
https://repository.isls.org//handle/1/888
Appears in Collections:ICLS 2018

Files in This Item:
File: 55.pdf    Size: 361.61 kB    Format: Adobe PDF