
Affective Computing: Posture Tracking and Emotion Recognition


Building on previous research, I found this article on posture tracking and emotion recognition using a Kinect depth camera. This may solve the problem identified previously: it eliminates the need for a direct connection between user and computer through a pulse or GSR sensor, allowing for a more natural interaction.

“Intelligent User Interfaces can benefit from having knowledge on the user’s emotion. However, current implementations to detect affective states, are often constraining the user’s freedom of movement by instrumenting her with sensors. This prevents affective computing from being deployed in naturalistic and ubiquitous computing contexts. ”

“In this paper, we present a novel system called mASqUE, which uses a set of association rules to infer someone’s affective state from their body postures. This is done without any user instrumentation and using off-the-shelf and non-expensive commodity hardware: a depth camera tracks the body posture of the users and their postures are also used as an indicator of their openness. By combining the posture information with physiological sensors measurements we were able to mine a set of association rules relating postures to affective states. ”
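To get my head around that pipeline, I sketched the general idea in Python. This is not the authors' mASqUE implementation: it derives a crude "openness" label from hypothetical depth-camera wrist joints, pairs it with a stand-in arousal label (in place of the paper's physiological measurements), and reports support and confidence for posture-to-affect association rules. The joint data, thresholds, and labels are all illustrative assumptions.

# Rough sketch of the mASqUE idea (my assumptions, not the paper's code):
# discretise a posture feature from skeleton joints, pair it with an
# affect label, and compute association-rule statistics linking the two.

from itertools import product
import math


def openness_label(left_wrist, right_wrist, shoulder_width):
    """Call a posture 'open' when the wrists are spread wider than the
    shoulders -- a crude stand-in for the paper's posture/openness cue."""
    span = math.dist(left_wrist, right_wrist)
    return "posture_open" if span > shoulder_width else "posture_closed"


# Toy frames: (left wrist xyz, right wrist xyz, shoulder width, affect label).
frames = [
    ((-0.45, 1.1, 2.0), (0.45, 1.1, 2.0), 0.40, "high_arousal"),
    ((-0.10, 0.9, 2.0), (0.10, 0.9, 2.0), 0.40, "low_arousal"),
    ((-0.50, 1.2, 2.1), (0.50, 1.2, 2.1), 0.40, "high_arousal"),
    ((-0.08, 0.8, 1.9), (0.12, 0.8, 1.9), 0.40, "low_arousal"),
]

# Each frame becomes a (posture, affect) transaction.
transactions = [(openness_label(lw, rw, sw), affect) for lw, rw, sw, affect in frames]

postures = {p for p, _ in transactions}
affects = {a for _, a in transactions}
n = len(transactions)

# Report rules of the form posture -> affect with their support and confidence.
for posture, affect in product(postures, affects):
    both = sum(1 for p, a in transactions if p == posture and a == affect)
    posture_count = sum(1 for p, _ in transactions if p == posture)
    if posture_count and both:
        support = both / n
        confidence = both / posture_count
        print(f"{posture} -> {affect}: support={support:.2f}, confidence={confidence:.2f}")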

 

 

 

[Figure: posture combination]

 

“An analysis of the user evaluation showed that mASqUE is suitable for deployment in ubiquitous computing environments as its rich, extensive range of emotion representations (i.e. affective states) is able to inform intelligent user interfaces about the user’s emotion. This is especially important for evaluating user experience in ubiquitous computing environments because the spontaneous affective response of the user can be determined during the process of interaction in real-time, not the outcome of verbal conversation. ”

Chiew Seng Sean Tan, Johannes Schöning, Kris Luyten, and Karin Coninx. 2013. Informing intelligent user interfaces by inferring affective states from body postures in ubiquitous computing environments. In Proceedings of the 2013 International Conference on Intelligent User Interfaces (IUI ’13). ACM, New York, NY, USA, 235–246. DOI: 10.1145/2449396.2449427. http://doi.acm.org/10.1145/2449396.2449427

 

