HCI Students Conference

WHEN

27 June 2022

09:00

TOPIC
Research Projects 2022
WHERE

FB Computerwissenschaften - Hörsaal I - Christian Doppler

HCI Students Virtual Summer 2022

The Conference will take place on June 27th, 2022 in Hörsaal I - Christian Doppler.

The Proceedings of the Salzburg HCI Students Conference 2022 are presented here:

Proceedings of the Conference


SPEAKERS
Dominik Hofer, Daria Kolosovskaia, Zeynep Karakaya
Environment adaptation to human emotion, empathic robots, human-building interaction

Brigitte Schampf, Christina Fuchs, Barat Venkitachalam Krishna
Studying the Effects of Unanticipated Multimodal Sensory Stimuli on the Perceived Feeling of Empathy in Virtual Reality Experiences

Virtual Reality and immersive experiences are becoming popular, not only for typical use cases like gaming, but also for social interactions such as events, meetings, or conversations. These relatively new applications of Virtual Reality try to bridge the gap between the physical and digital worlds by giving the user a completely immersive experience. In this paper, we present the results of a study that we conducted to examine the effects of unanticipated sensory stimuli (visual, auditory, olfactory, tactile, gustatory, and thermoceptive) on the user's perceived feeling of empathy during a Virtual Reality experience. The study also explores the participant's feelings when experiencing unanticipated sensory stimuli. Previous studies have found that different types of Virtual Reality experiences can enhance emotional empathy or arouse compassionate feelings, but do not necessarily encourage users to imagine the perspectives of other people. In this study, the participants are taken through a Virtual Reality experience that incorporates interactive storytelling and transports the user through five different emotions: happiness, fear, anger, sadness, and relief. The participant interacts with a storyteller within the Virtual Reality space, who narrates an emotional story. During the course of the narration, the user is subjected to sensory input to study the effect of unanticipated sensory stimuli. After the story, the user is interviewed based on a specific set of questions to understand the details of their experience. The results of this study help us understand the perceived feeling of empathy in users within multi-sensory Virtual Reality experiences and the element of surprise, in order to improve and develop more engaging Virtual Reality experiences.

Corey Tran, Hugar Sagar, Marlene Stadlmann
An Integrated Approach To Increasing Empathy Towards Patients Through Embodied Emotions

Empathy is one of the core principles in practicing medicine. However, levels of empathy developed during training on medical mannequins may not translate to medical practice on real people. The objective of this research is to understand whether video simulations of faces on a mannequin can generate more empathy during nursing student training than training on regular silicone mannequins with immutable faces. Using two versions of training mannequins, with and without simulated emotions respectively, differences in empathy levels during nursing interaction with mannequins were measured with the Jefferson Scale of Empathy (JSE), a standardized 20-question self-report questionnaire, and followed up with a semi-structured interview. This study analyzes how emotions affect how empathetic nursing students in training are towards their ‘patients’.

Carina Klaussner, Oliver Claasen, Marc Beiwinkler
Comparison of evoked empathy for a three-dimensional virtual reality character with real facial expressions and a two-dimensional human stimulus

Virtual reality experiences can evoke a higher level of empathy for the shown characters. This depends on the viewer's cognitive processes, which are influenced by the quality, presence, and flow of the shown scenario. In this study, we increased these factors by using real facial expressions for a character in virtual reality to investigate whether realistic VR characters can evoke the same level of empathy as a conventional video call does. In a laboratory study (N=24), participants viewed a young female student consulting the student pastoral care service because of personal issues. The participants were divided into two groups, watching either a metahuman in a virtual reality format or a real human in a two-dimensional format. Results show that both conditions evoked the same level of empathy among the viewers. The virtual reality format showed less variance among the empathy levels. This indicates that VR characters with real facial expressions increase the utility of virtual reality in the realm of interpersonal functioning, which can result in a wide range of application areas such as virtual events, business meetings, gaming, healthcare, and many more.

Mats Reckzügel, Mieke Fimpel, Nathaniel Flach
“Hey, can we talk?”: Exploring How Revealing Implicit Emotional Responses Tangibly can Foster Empathy During Mobile Texting

Emotions are complex. In traditional face-to-face conversations, people can often understand the explicit and implicit emotions and emotional capacity shared by their counterpart through body language, word usage, and tone. It is extremely difficult to identify these verbal and non-verbal responses when using mobile devices to communicate. However, for clarity and empathy, it would be useful to recognize emotional reactions, especially after ambiguous conversation starters such as “hey, can we talk?” are sent. In this paper, we explore how a tangible artefact could enhance understanding and empathy in text-based communication by sharing implicit emotional feedback. We propose Tange: a phone case that senses the initial implicit emotional reaction of the receiver of a text message and sends it to the sender, thereby explicitly revealing to the sender how best they can proceed with the subsequent enriched conversation. To design this artefact, we first conducted a workshop to understand what types of emotions and bodily reactions digital natives felt in the context of conversation starters from varying sender types. We then hosted a second workshop where a group of HCI practitioners created several design directions based on the insights gained in the first workshop. After prototyping Tange based on the outcomes of the two workshops, we conducted usability testing on the artefact to see if a sender could accurately understand the implicit emotion shared by a receiver based on their level of arousal. This paper concludes by discussing the nuances of designing for emotions in an online context, as well as suggestions for future designs.

Alicia Fradera, Anika Watzka, Luca Frantzmann
Feel the fatigue: A first-person approach to enhance empathy towards people with fatigue using a wearable

The presented study explores how empathy and prosocial behavior for people affected by the invisible symptom of fatigue can be enhanced through the use of a wearable. The wearable mimics the symptom of fatigue to a significant extent and is therefore used for a first-person approach to qualitatively explore empathy. 

The study consists of three parts: 1) Interviews with affected persons (n = 5) to gather in-depth insights about the symptom. 2) Ideating and prototyping a wearable in consultation with affected persons. 3) A study with non-affected persons (n = 6) who experienced a day in the life of a person with fatigue. For the latter main study, participants were asked to think out loud and describe their experiences and thoughts while completing different everyday tasks.

The bodily as well as mental experiences described by participants match to a great extent the experiences reported by actually affected persons. We therefore conclude that empathy is enhanced. We also conclude that while prosocial behavior is not necessarily enhanced, awareness of different ways to behave prosocially towards people affected by fatigue might be strengthened. However, the definition and practice of prosocial behavior might be subjective.

Cansu Demir, Isabel Melibeu, Sonja Lang
AutoMate. A first-aid communication system to reduce the bystander effect in car accidents

This paper focuses on breaking the bystander effect in the context of a car accident, and on how a first-aid communication system (AutoMate) can improve bystanders' empathy towards the victim. Conducting a user study with eight users in a car simulator, we found positive reactions towards AutoMate, leading to a more empathic approach towards the victim. We investigated how AutoMate can help to reduce the bystander effect in car accidents and found it to be a useful tool as an integrated add-on to existing car systems. The results show that AutoMate's guidance displayed high levels of clarity and easy-to-use interaction, leading to a more reassured driver and a more positive impact on their empathy levels.

Ramona Hauser, Philipp Maurer, Richard Parayno
Enhancing Empathy through Vibrotactile Stimulation of the Torso

In recent years, novel interaction approaches in the realm of haptic technology have been used to induce emotions in various use cases. Specifically, vibrotactile stimulation has been observed to cause more immersive experiences in media. However, understanding whether this effect may transfer to the concept of empathy between users has not yet been explored. This paper presents a concept for a vibrotactile vest which users can wear to “feel” the emotions of their communication partner on their skin. We used a commercially available vest to design emotional vibration patterns which were then tested in an experimental study (n=31). Our results show that the vest partially enhanced empathy: It had no significant effect on enhancing cognitive empathy (understanding of someone else’s emotions) but had a significant effect (p<0.05) on affective empathy (feeling someone else’s emotions). Also, this effect was only seen for high-arousal emotions such as anger (d=1.19) and happiness (d=0.83), but not for low-arousal emotions such as sadness. Lastly, we contribute design recommendations to broaden the scope of emotions conveyed with the help of haptic technologies. 

Time Schedule
27 June
09:00 - 09:15
Conference Opening
09:15 - 10:00
Session 1
Short Intro of the Speakers
10:00 - 11:00
Session 2
Short Intro of the Speakers
11:00 - 12:00
Coffee Break
Grab your own coffee!
12:00 - 13:00
Session 3
Short Intro of the Speakers
13:00 - 14:00
Session 4
Short Intro of the Speakers
14:00 - 14:15
Conference & Semester Closing
Partner
Team
Marc Beiwinkler
Student HCI
Dominik Hofer
Student HCI
Daria Kolosovskaia
Student HCI
Zeynep Karakaya
Student HCI
Brigitte Schampf
Student HCI
Christina Fuchs
Student HCI
Barat Venkitachalam Krishna
Student HCI
Corey Tran
Student HCI
Hugar Sagar
Student HCI
Marlene Stadlmann
Student HCI
Carina Klaussner
Student HCI
Oliver Claasen
Student HCI
Mats Reckzügel
Student HCI
Mieke Fimpel
Student HCI
Nathaniel Flach
Student HCI
Alicia Fradera
Student HCI
Anika Watzka
Student HCI
Luca Frantzmann
Student HCI
Cansu Demir
Student HCI
Isabel Melibeu
Student HCI
Sonja Lang
Student HCI
Ramona Hauser
Student HCI
Philipp Maurer
Student HCI
Richard Parayno
Student HCI