Visualize Emotions in Virtual Reality

Designing biosignals-driven visualization to facilitate understanding and connection between online collaborators.

VR, Interaction Design, Affective Computing


Challenge

In a virtual reality chat environment, people often lack the non-verbal cues to tell how others feel. Muffled voices can add further difficulty to this task. The simple fix to this problem is to allow users to change their avatar's facial expressions to match their own internal states. However, there are two potential problems with this solution. First, lacking a proper trigger, users may not know when to change the display of their emotions. Second, while deliberately switching the avatar's facial expressions may convey one's emotions, it can hardly elicit empathy in others.

Solution

To solve the first problem, what if we could automate the process of changing visualizations based on people's biosignal fluctuations? To solve the second problem, what if we could make the visualization of one's emotions more interactive, and potentially more immersive? For this project, I designed and implemented five prototypes to explore how we can integrate biosignals into a virtual reality environment to convey information about our internal states during online collaboration.

Tools:  Sketch, Adobe Illustrator, Adobe After Effects, Unity
Duration: Jan. 2018 - Present (Ongoing)

Background Research

Competitive analysis revealed that the majority of existing products use graphs, lights, imagery cues, or physical motion to reflect changes in users' biosignals.

To get inspiration on how to visualize biosignal data, I examined both digital and physical products. I mapped the representations of existing products from the most distinct to the most inferred, and identified four common techniques used to visualize biosignal data. While a graph is the most direct way to demonstrate the ups and downs of one's biosignal, it can be distracting and too rigid in the context of online chat. In contrast, lights are more flexible but can also be ambiguous in their meaning. Imagery cues and physical motion are both relatively under-explored, which yields both a sense of novelty and potential room for misinterpretation. Understanding the pros and cons of each type of visualization helped me weave different techniques together to create a fun and unambiguous experience in the final design.

Academic research on human emotions lays the foundation for what to animate in my prototypes.

The circumplex model of emotion, developed by the well-known psychologist James Russell, describes how emotions can be distributed in a two-dimensional circular space. In this model, emotional states can be represented by any level of valence and arousal, or by a neutral level of one or both of these factors. Following this model, in each prototype I designed five different animations to represent the four quadrants and the neutral state. In addition, while arousal is inferred from the biosignal, users are prompted to choose their valence value, positive or negative, once a change in arousal is detected.
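To make this mapping concrete, here is a minimal sketch in Unity C# of how an arousal reading and a user-chosen valence could be mapped onto the four quadrants plus the neutral state. The arousal scale, the dead zone threshold, and all names here are illustrative assumptions rather than the project's actual implementation.

```csharp
using UnityEngine;

// The five states used in each prototype: one per circumplex quadrant plus neutral.
public enum EmotionState { Neutral, Excited, Anxious, Sad, Calm }

public static class CircumplexMapper
{
    // arousal: hypothetical biosignal-derived value normalized to [-1, 1].
    // valencePositive: the user's prompted choice once an arousal change is detected.
    public static EmotionState Map(float arousal, bool valencePositive, float deadZone = 0.2f)
    {
        // Small fluctuations stay in the neutral state.
        if (Mathf.Abs(arousal) < deadZone)
            return EmotionState.Neutral;

        if (arousal > 0f)   // high arousal
            return valencePositive ? EmotionState.Excited : EmotionState.Anxious;
        else                // low arousal
            return valencePositive ? EmotionState.Calm : EmotionState.Sad;
    }
}
```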

Variety of Representation

To explore how different representations might influence people’s perception of emotions, I designed and developed five prototypes using Unity.

Emotion Visualization

To visualize emotions in objects and the environment, I identified three key components through design iterations: color, motion, and environmental effects. The table on the right shows how I varied these components across the different prototypes.

Color

Colors have long been associated with emotions. For instance, in the recent Disney movie Inside Out, the characters are designed in different colors: Joy is yellow, Anger is red, Fear is purple. Inspired by this movie, I used different colors to reflect people's changing emotions across the prototypes. The most representative one is the fairy. When the person feels neutral or excited, the fairy is shown as yellow. When the person feels sad, the fairy turns black and falls onto the floor. When the person feels anxious, the fairy is shown as red. Lastly, when the person feels calm, the fairy turns blue.

The color of the fairy changes depending on the user's biosignal fluctuations.
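As a rough sketch of how this could be scripted in Unity (reusing the hypothetical EmotionState enum from the earlier sketch; the real prototype may handle colors differently), a small component could tint the fairy's renderer per state:

```csharp
using UnityEngine;

// Tints the fairy's renderer based on the current emotion state.
// A hypothetical sketch; the sad state also drops the fairy to the floor,
// but only the color change is shown here.
public class FairyColorizer : MonoBehaviour
{
    public Renderer fairyRenderer;   // assigned in the Unity Inspector

    public void ApplyEmotion(EmotionState state)
    {
        switch (state)
        {
            case EmotionState.Neutral:
            case EmotionState.Excited: fairyRenderer.material.color = Color.yellow; break;
            case EmotionState.Anxious: fairyRenderer.material.color = Color.red;    break;
            case EmotionState.Calm:    fairyRenderer.material.color = Color.blue;   break;
            case EmotionState.Sad:     fairyRenderer.material.color = Color.black;  break;
        }
    }
}
```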

Motion

The problem with using color alone is that while it works well for abstract representations, changing the colors of concrete objects can be confusing. Thus, for the prototypes that contain concrete objects, I focused on using motion to express one's emotions.

In a neutral state, the birds fly up and down on both sides of the avatar.
When the user gets anxious, the birds fly around the avatar's head.
When the user gets sad, the birds drop to the floor, motionless.
When the user becomes excited, one of the birds flies in a joyful manner and the other jumps around in its original position.
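A hedged sketch of how the bird behaviors listed above could be switched in Unity, assuming each behavior is a separate Animator state; the state names ("Hover", "Circle", "Drop", "JoyfulFlight") are hypothetical placeholders, not the prototype's actual assets:

```csharp
using UnityEngine;

// Switches a bird's Animator between hypothetical motion states
// based on the user's current emotion.
public class BirdMotionController : MonoBehaviour
{
    public Animator birdAnimator;   // assigned in the Unity Inspector

    public void ApplyEmotion(EmotionState state)
    {
        switch (state)
        {
            case EmotionState.Neutral: birdAnimator.Play("Hover");        break;  // up and down beside the avatar
            case EmotionState.Anxious: birdAnimator.Play("Circle");       break;  // circling the avatar's head
            case EmotionState.Sad:     birdAnimator.Play("Drop");         break;  // motionless on the floor
            case EmotionState.Excited: birdAnimator.Play("JoyfulFlight"); break;  // lively flight and hopping
        }
    }
}
```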

Environmental effect

One of the advantages that virtual reality has over traditional communication platforms is its immersive experience. While both color and motion can be manipulated on existing communication platforms, environmental effects triggered by special events provide a delightful and unique experience in virtual reality. By using environmental effects, I want to enable users not only to see the other person's emotion, but to feel it and develop empathy and connection with each other.

In the spaceship prototype, when the other person gets excited, the user is surrounded by blue twinkling stars emerging from space.
When the other person gets sad, the user is soaked in pouring rain.
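One way these effects could be wired up in Unity is by toggling particle systems that surround the viewer; the sketch below is an assumed implementation, with the stars and rain references as hypothetical fields assigned in the Inspector:

```csharp
using UnityEngine;

// Enables an environment-wide particle effect around the viewer based on the
// other person's emotion: twinkling stars for excitement, rain for sadness.
public class EnvironmentalEffects : MonoBehaviour
{
    public ParticleSystem stars;   // blue twinkling stars (assigned in the Inspector)
    public ParticleSystem rain;    // pouring rain

    public void ApplyEmotion(EmotionState state)
    {
        if (state == EmotionState.Excited && !stars.isPlaying) stars.Play();
        else if (state != EmotionState.Excited) stars.Stop();

        if (state == EmotionState.Sad && !rain.isPlaying) rain.Play();
        else if (state != EmotionState.Sad) rain.Stop();
    }
}
```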

Design Iterations

Starting with AR, I explored a variety of single-dimensional elements to facilitate emotion expression.

Initially, I tried to implement this project in the newly released Facebook AR Studio, thinking that it would be a more accessible platform for people to use. I created various prototypes specifically for the AR environment. Since facial recognition is a unique feature of AR, some of my prototypes were built around the idea of attaching new elements to the human body to facilitate emotion expression. Additionally, since people can still see the real world, I didn't need to construct an entirely new space in my prototypes.

Many of my initial AR prototype designs were single-dimensional and centered around the human face.

Transitioning from AR to VR, I provided users with more contextual information to establish a convincing narrative.

Two weeks into the project, I realized that there was no way to import biosignal data into Facebook AR Studio. Transitioning from AR to VR presented new design challenges. First, since users would be isolated from the real world, I needed to add environmental cues to help them understand where they are and, potentially, what their relationship is with the other avatar. Also, it should feel plausible to users that they, the other avatar, and the additional biosignal visualization elements co-exist in the same space.

When switching from AR to VR, I paid more attention to establishing context.

The evolution of the spaceship prototype demonstrates how visual cues can help construct a narrative. In the initial AR prototype, the user only sees a spaceship floating on top of the real world. Even though the yellow stars may nudge users to imagine a different context than the one they are situated in, the stars are more decorative than functional in terms of storytelling. When first moving to VR, I added a background so that users know they have been relocated to outer space. In the final prototype, I further added a dashboard and window tint. Even though these contextual elements are not directly related to emotion expression, having them in the setting creates a backdrop story of why users are where they are and helps build a sense of presence and immersion.

The evolution of the spaceship prototype

Considering ergonomics in VR to reduce user fatigue.

It is well known that many people get dizzy after being in VR for a while. This fatigue can be exacerbated if people are required to constantly turn their heads. Thus, I tried to keep all the major visualization elements within people's binocular field of view to reduce their head movement.

In the initial design of the floating shapes prototype, the cubes were spread out and required users to turn around to see everything. I later reduced the distance between the cubes and the avatar.
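A related, hedged approach in Unity is to clamp each element's angle from the headset camera so it never drifts outside a comfortable viewing cone; the 60° half-angle below is an assumed value, not one measured in the project:

```csharp
using UnityEngine;

// Keeps a visualization element within a comfortable horizontal angle of the
// headset camera so the user does not have to turn their head to see it.
public class KeepInView : MonoBehaviour
{
    public Transform headCamera;   // the VR camera transform
    public float maxAngle = 60f;   // assumed comfortable half-angle, in degrees

    void LateUpdate()
    {
        Vector3 toObject = transform.position - headCamera.position;
        Vector3 forward = headCamera.forward;

        if (Vector3.Angle(forward, toObject) > maxAngle)
        {
            // Rotate the offset back toward the camera's forward direction,
            // keeping the original distance from the camera.
            Vector3 clamped = Vector3.RotateTowards(
                forward * toObject.magnitude, toObject, maxAngle * Mathf.Deg2Rad, 0f);
            transform.position = headCamera.position + clamped;
        }
    }
}
```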

Utilizing visual cues to strengthen the relationship between avatar and biosignal visualization elements.

When it is not the avatar itself that expresses emotions, establishing a connection between the avatar and the visualization elements becomes very important for viewers to understand the change of emotions and to build empathy. In the floating shapes prototype, without proper visual cues, users constantly wondered about the relationship between the astronaut avatar and the cubes: "are the cubes attacking or protecting the avatar?" To solve this problem, I added a stand for the avatar that changes color along with the cubes. As a result, users perceived more connection between the avatar and the cubes.

Adding a stand that changes color along with the cubes, as shown on the right, made users perceive a closer connection between the avatar and the biosignal visualization elements.

Fine-tuning the details to construct engaging experiences.

One thing that stood out to me in user testing was how much attention people paid to the details. When immersed in a new environment, they were so curious that they didn't want to miss anything. In the fairy prototype, I took a step beyond pure functionality and attached trails to give the fairy a more playful personality. People indeed noticed the trails and felt really excited about them.

Adding trails behind the fairy gives it a more playful personality.
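In Unity terms, such a trail can be added with a TrailRenderer component; the settings below are illustrative guesses rather than the prototype's actual values:

```csharp
using UnityEngine;

// Attaches a simple fading trail behind the fairy to give it a playful feel.
// Parameter values are illustrative, not the prototype's actual settings.
public class FairyTrail : MonoBehaviour
{
    void Start()
    {
        var trail = gameObject.AddComponent<TrailRenderer>();
        trail.time = 0.6f;                            // how long the trail lingers, in seconds
        trail.startWidth = 0.05f;
        trail.endWidth = 0.0f;
        trail.material = new Material(Shader.Find("Sprites/Default"));
        trail.startColor = Color.yellow;
        trail.endColor = new Color(1f, 1f, 0f, 0f);   // fade to transparent
    }
}
```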

Finding a balance point for dynamic animations to be empathy-evoking but not disturbing.

In the beginning, I was laser-focused on how to accurately visualize emotions. However, since the overarching goal is to foster understanding and connection, accurate visualization is not enough. The visualization needs to remain inviting even when one person is undergoing negative emotions. For the floating shapes prototype, I initially used four shaky cubes to represent anxiety. Even though they portrayed the emotion, they made viewers feel panicked and unwelcome to continue the conversation. In the final design, I changed the visualization to four red boxes with tentacles stretching out. People could still read the anxious emotion, but found this visualization more intriguing and were more inclined to talk about what they saw with their partners.

Compared to the black shaky boxes on the left, the red boxes with tentacles on the right made users feel more welcome to continue their conversation and openly discuss the visualization.
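For reference, the original "shaky cube" behavior could be approximated in Unity with a small Perlin-noise jitter like the sketch below (an assumed implementation; the tentacle version in the final design presumably relies on animated geometry rather than a scripted effect):

```csharp
using UnityEngine;

// Adds a small Perlin-noise jitter to a cube's position to convey anxiety.
// A hypothetical approximation of the original "shaky cube" effect.
public class AnxiousShake : MonoBehaviour
{
    public float amplitude = 0.05f;   // assumed shake strength in meters
    public float frequency = 8f;      // assumed shake speed

    Vector3 restPosition;

    void Start() { restPosition = transform.localPosition; }

    void Update()
    {
        float t = Time.time * frequency;
        Vector3 jitter = new Vector3(
            Mathf.PerlinNoise(t, 0f) - 0.5f,
            Mathf.PerlinNoise(0f, t) - 0.5f,
            Mathf.PerlinNoise(t, t)  - 0.5f);
        transform.localPosition = restPosition + jitter * amplitude;
    }
}
```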

Evaluating the Design

I tested the prototypes with people in pairs. For each pair, one person wore the Oculus headset and the other wore the Empatica wristband, which measured their heart rate. They were asked to have a casual conversation for 5 minutes and then switch roles. Each pair tried out two different prototypes and was afterwards asked specific questions about their perception of the other person, their behavior changes, and their impression of the visualizations. Here are some interesting insights:

Participants who are not good at reading others' emotions in daily life found the visualization extremely helpful in telling how the other person felt.

Participants felt more comfortable with having abstract rather than concrete objects represent their emotions.

Participants enjoyed the more immersive prototypes, as some of them got fatigued quickly when focusing on or tracking a single object for too long.

Sometimes, participants were not aware of how much their heart rates had changed. In those cases, the change of visualization led to one of two reactions: 1) distrusting the sensor, or 2) excitement about discovering something new about themselves.

Participants who were close to each other tended to talk more about the change of visualization and each other's emotions in conversation. However, participants who were not familiar with each other responded by changing the conversation topic to avoid contention.

There was still room for miscommunication and misinterpretation. When not openly discussing the change of visualization, people sometimes misattributed the reason why the change happened.

The user testing revealed several areas for improvement. First, as many participants brought up, emotions are points on a spectrum rather than discrete categories. Ensuring smooth transitions between emotions will provide a more seamless experience and help the other person respond quickly. Second, although the current prototypes did help friends communicate more openly about how they feel, most of the prototypes were hardly successful in bringing people who are not familiar with each other closer. Further testing is needed to explore whether placing both people's biosignal visualizations side by side can solve this problem. Another possible solution is to make the visualization more interactive so that people are encouraged to talk about what they see and why the change happens.
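As a sketch of the smooth-transition idea (an assumption about how it could be implemented, not an existing feature of the prototypes), emotion-driven colors could be blended over time instead of switched instantly:

```csharp
using UnityEngine;

// Gradually blends a renderer's color toward the color of the current emotion
// state instead of switching instantly, to avoid jarring jumps between emotions.
public class SmoothEmotionColor : MonoBehaviour
{
    public Renderer targetRenderer;   // assigned in the Unity Inspector
    public float blendSpeed = 1.5f;   // assumed blend rate per second

    Color targetColor = Color.yellow; // neutral default

    public void SetTargetColor(Color color) { targetColor = color; }

    void Update()
    {
        targetRenderer.material.color = Color.Lerp(
            targetRenderer.material.color, targetColor, blendSpeed * Time.deltaTime);
    }
}
```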

As for next steps, I will be working with a PhD student this summer to design and launch a more rigorous experiment to test how biosignal visualization affects people's behavior and their perception of others.

Reflections

In VR, space is the story.

Designing for VR is completely different from my past experience of designing for screens. It provides almost infinite possibilities for storytelling and interaction. As a designer, I learned to use the whole space, including the background, environmental effects, and individual objects, to construct a novel experience for users. I was mesmerized by the potential to empower users not only to see things that are invisible in their daily lives, but also to feel them. I'm very excited to work more with AR/VR as a medium in the future.