The Impact of Physical Movement on Emotional Perception

Recent research has shown that our bodily movements can significantly influence how we perceive others’ emotions through their facial expressions. In a recent virtual reality experiment, participants were more likely to judge a face as angry when they moved away from it themselves, compared to when the face moved away from them.

The Relationship Between Movement and Emotional Perception

The findings suggest a bidirectional relationship between movement and emotion recognition, where avoidance behaviors enhance the perception of threat. These insights could help improve the design of social interaction in virtual communications and emotional artificial intelligence systems.

The study, conducted by a team at Toyohashi University of Technology, demonstrated that approach and avoidance behaviors in a virtual reality environment affect how individuals recognize facial expressions: participants judged an ambiguous expression as "angry" more often when they actively moved away from the facial stimulus than when the face moved away from them.

Experimental Procedures and Design

To test this hypothesis, the research team conducted three psychological experiments using virtual reality. Participants wore head-mounted displays and observed 3D facial models under four approach and avoidance conditions:

Active Approach: where the participant moves towards the face.

Active Avoidance: where the participant moves away from the face.

Passive Approach: where the face moves towards the participant.

Passive Avoidance: where the face moves away from the participant.

Facial expressions were generated by morphing between happy and angry (or happy and fearful) expressions across seven levels. Participants judged each expression as either "happy" or "angry" ("happy" or "fearful" in the fear experiments), depending on the experimental condition.
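The design above (four movement conditions crossed with seven morph levels) can be sketched as a trial list. This is a minimal illustration, not the authors' code: the condition names, morph weights, and repeat count are assumptions.

```python
import itertools
import random

# Hypothetical condition labels and morph weights; the study reports four
# movement conditions and seven blend levels, but not these exact values.
CONDITIONS = ["active_approach", "active_avoidance",
              "passive_approach", "passive_avoidance"]
MORPH_LEVELS = [i / 6 for i in range(7)]  # 0.0 = fully happy, 1.0 = fully angry

def build_trials(repeats=10, seed=0):
    """Cross every movement condition with every morph level, then shuffle."""
    trials = [
        {"condition": c, "morph": m}
        for c, m in itertools.product(CONDITIONS, MORPH_LEVELS)
        for _ in range(repeats)
    ]
    random.Random(seed).shuffle(trials)
    return trials

trials = build_trials()
print(len(trials))  # 4 conditions x 7 levels x 10 repeats = 280
```

Shuffling with a seeded generator keeps the presentation order reproducible across participants while still interleaving conditions.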

Study Results and Implications

The results of the first experiment showed that participants were more likely to recognize a facial expression as “angry” when they actively avoided the face, compared to when the face moved away from them. This suggests that personal avoidance behavior may enhance the perception of threat in others’ facial expressions.
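A comparison like the one reported here reduces to the proportion of "angry" judgments per condition. The sketch below shows that computation on invented records; the data are purely illustrative, not the study's results.

```python
from collections import defaultdict

def angry_rate(records):
    """Proportion of 'angry' responses per condition from (condition, response) pairs."""
    counts = defaultdict(lambda: [0, 0])  # condition -> [angry count, total count]
    for condition, response in records:
        counts[condition][1] += 1
        if response == "angry":
            counts[condition][0] += 1
    return {c: angry / total for c, (angry, total) in counts.items()}

# Fabricated example data for illustration only.
records = [
    ("active_avoidance", "angry"), ("active_avoidance", "angry"),
    ("active_avoidance", "happy"),
    ("passive_avoidance", "angry"), ("passive_avoidance", "happy"),
    ("passive_avoidance", "happy"),
]
rates = angry_rate(records)
print(rates["active_avoidance"])   # 2/3 in this toy data
print(rates["passive_avoidance"])  # 1/3 in this toy data
```

In the actual study this comparison would be made at each morph level, tracing out a psychometric curve per condition.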

This pattern supports the hypothesis that behavior and perception are linked in a bidirectional manner. Yugo Kobayashi, the lead author and a Ph.D. student in the Department of Computer Science and Engineering, commented, “In current communication environments such as video conferences, opportunities for physical movement are limited. These findings suggest that face-to-face communication involving physical movement may facilitate natural recognition of facial expressions.”

Conclusion

The study provides evidence that one's own approach and avoidance behaviors can modify how facial expressions are recognized. Future work will examine which aspects of these behaviors, such as motor intention, visual motion, and sensory feedback, are crucial to this modulation.

These results indicate that integrating body-based perception cues could enhance emotional artificial intelligence and virtual environments, providing more realistic and empathetic social interactions.