RoboLang BIOS 2020
Background
In recent years, social robots have been introduced in a wide variety of educational contexts, for instance to aid students in foreign language (L2) learning. Social robots offer the possibility of multimodal communication, including extralinguistic as well as paralinguistic properties. However, relatively little research has examined the emotional reactions of humans interacting with social robots in educational settings. A study examining these emotional reactions is therefore needed.
Goal of the project
The goal of our project was to conduct such a study, focusing on how a social robot’s multimodal communication could affect proficient adult L2 speakers. We were to compare two conversational situations in which the robot would be either 1) encouraging or 2) doubtful after the participant’s appropriate answer. Data on the participants’ emotion-related reactions were to be collected and analyzed in both types of situations. The originality of our study lay in the combination of different data collection methods to study emotion-related reactions in human–robot interaction (HRI) with adults. The study was to explore these emotional reactions by 1) measuring electrodermal activity with wearable bracelets, 2) measuring pupillometric reactions with a wearable eye-tracker, and 3) filming the participants’ facial expressions with video cameras.
Results of the project
Unfortunately, due to the spread of COVID-19 during 2020, our team could not carry out the actual study, and thus no data could be collected. The project scope was therefore redefined to focus on creating the framework for the study to be conducted at a later date.
In a nutshell, our project’s results were:
- An easy-to-follow theoretical framework for the actual study.
- Dialogues and behaviors for the social robot programming software.
- A simple-to-use pipeline for each subject’s data.
- Revised instructions for the devices used to gather the emotional reactions of subjects.
- Data analysis code for each subject that scales easily to the main study.
- Working speech-to-text and emotion recognition.
- A template for subject survey information to be stored in the University of Turku’s REDCap database.
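To illustrate the shape of the per-subject pipeline and analysis code mentioned above, here is a minimal, hedged sketch in Python. The function and variable names (`moving_average`, `count_peaks`, `summarize_subject`) are hypothetical stand-ins, not the project’s actual code, and the electrodermal activity (EDA) samples are synthetic; the point is only that a single per-subject summary function scales naturally to many subjects in the main study.

```python
# Hypothetical sketch of a per-subject EDA analysis step; the real project
# code, file formats, and thresholds are not shown in this report.
from statistics import mean


def moving_average(samples, window=5):
    """Smooth a raw EDA signal with a simple moving average."""
    if window < 1 or window > len(samples):
        raise ValueError("window must be between 1 and len(samples)")
    return [mean(samples[i:i + window]) for i in range(len(samples) - window + 1)]


def count_peaks(samples, threshold):
    """Count local maxima above a threshold, a rough proxy for arousal events."""
    return sum(
        1
        for i in range(1, len(samples) - 1)
        if samples[i] > threshold and samples[i - 1] < samples[i] > samples[i + 1]
    )


def summarize_subject(subject_id, eda_samples):
    """Produce one summary record per subject; mapping this over all
    subjects is what makes the pipeline easily scalable."""
    smoothed = moving_average(eda_samples)
    return {
        "subject": subject_id,
        "mean_eda": mean(smoothed),
        "peaks": count_peaks(smoothed, threshold=mean(smoothed)),
    }


# Example run with synthetic data standing in for a real recording
demo = summarize_subject("S01", [0.1, 0.2, 0.5, 0.9, 0.4, 0.3, 0.8, 1.2, 0.6, 0.2])
```

Scaling to the main study would then amount to applying `summarize_subject` to each participant’s recording and collecting the resulting records.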