Examining changes in medical students’ emotion regulation in an online PBL session
Given recent attention to emotion regulation (ER) as an important factor in personal well-being and effective social communication, there is a need for detection mechanisms that accurately capture ER and facilitate adaptive responding (Calvo & D’Mello, 2010). Current approaches to assessing ER are largely limited to self-report data such as questionnaires, inventories, and interviews (e.g., Davis, Griffith, Thiel, & Connelly, 2015). Although informative, these self-report approaches have important shortcomings, including social desirability biases, recall issues, and an inability to capture unconscious ER (Scherer, 2005). The research presented here addresses this gap by examining the use of multimodal observational data alongside self-report data to capture ER more accurately. Specifically, this study develops and employs a multimodal analysis of emotion data channels (facial, vocal, and postural) to provide a rich account of ER in an international case study of four medical students interacting in an emotionally challenging learning session (i.e., communicating bad news to patients) within a technology-rich learning environment. The findings reported in the paper can inform educators in designing programs that develop and evaluate students’ ER strategies, helping them regulate their own emotions and respond to the emotional needs of others in stressful situations. This work also contributes to the design of technology-rich environments that embed dynamic ER detection mechanisms, enabling systems to gain a more holistic view of participants and to adapt instruction to their affective needs.
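To illustrate the general idea of combining emotion data channels (not the study's actual analysis pipeline), the following minimal Python sketch fuses hypothetical, time-aligned per-channel emotion scores into a single series. The channel names, weights, sampling, and weighted-average fusion are assumptions for illustration only.

```python
import numpy as np

def fuse_channels(facial, vocal, postural, weights=(0.4, 0.3, 0.3)):
    """Weighted average of three time-aligned emotion-score streams.

    Each argument is a 1-D array of scores (e.g., probability of a target
    emotion per time step) of equal length; weights are illustrative.
    """
    channels = np.vstack([facial, vocal, postural])  # shape: (3, T)
    w = np.asarray(weights).reshape(3, 1)
    return (w * channels).sum(axis=0)                # shape: (T,)

# Toy example: 10 time steps of synthetic per-channel scores.
rng = np.random.default_rng(0)
facial, vocal, postural = rng.uniform(0, 1, size=(3, 10))
fused = fuse_channels(facial, vocal, postural)
print(fused.round(2))
```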