When Computers Decode Your Social Intention
Abstract
In this demo session, we present a framework based on our paper [1]. In real time, we analyze the trajectories of the human arm to predict social intention (personal or social). The trajectories of the 3D markers acquired by a Mocap system are represented in a shape space of open curves and are thus analyzed on a Riemannian manifold. Experiments on a new dataset show an average recognition rate of about 68% for the proposed method, which is comparable with the average score produced by human evaluation. The experimental results also show that the classification rate could be used to improve social communication between humans and virtual agents. To the best of our knowledge, this is the first real-time demo that uses computer vision techniques to analyze the effect of social intention on motor action in order to improve social communication between a human and an avatar.

The main goal is to categorize the user's intention into two classes, denoted {personal, social}. The experiment consists of three parts: a) data acquisition and a learning step; b) classification; c) kinematic analysis of the evolution of subjects as they interact with the avatar. All the scripts supporting our study are written in Matlab and C/C++. The equipment used is the following: 1) Qualisys motion-capture cameras (Qualisys system). The Qualisys system is delivered with a desktop computer with 8 GB of RAM and an Intel Core i7-4770K processor (8 CPUs) at 3.5 GHz. The camera frequency can vary from 100 to 500 Hz. A black glove equipped with infrared reflective markers is also provided with the Qualisys system. 2) Matlab (version R2014a) installed on the desktop computer; Qualisys provides a specific driver that allows the Matlab scripts to be coupled with their system. It is thus possible to control all the cameras directly from Matlab for real-time analysis, see Fig. 1.