Joan Manuel Marquès i Puig, Thanasis Daradoumis, Laura Calvet Liñán, Marta Arguedas
This work presents DSLab, an automated assessment tool for online distributed programming. DSLab is a web-based environment that provides transparent deployment and execution of assignments on remote computers, transparent initialization, and the possibility for students to add their own logs. To date, research provides no evidence that the effects of such a tool have been investigated in the field of online distributed programming. DSLab was evaluated in a real distributed learning environment by analysing students' own perceptions of their learning improvement and by exploring whether students' interactions with the tool yielded a fruitful learning experience. Two types of analysis were performed: a quantitative analysis of students' answers to a questionnaire and an analysis of the log files of students' interactions with the tool. Our results show that students perceived substantial learning improvement from using the automated assessment tool. Moreover, students produced fruitful interaction with the tool once they achieved high familiarization and sustained activity with it, which ultimately helped them improve their academic performance. Finally, the limitations of the current study and directions for future research are presented.