The Social Machines and Robotics lab (SMART) focuses on building socially and emotionally intelligent machines and robots that can interact and communicate with humans in a natural way, exhibiting a high degree of social intelligence through their appearance and behavior.
Our research focuses on the analysis, modeling, and recognition of social and emotional signals in the context of human-human and human-machine interactions (with a computer, robot, virtual agent, etc.). This research is mainly based on computer vision, signal processing, multi-modal fusion, and machine learning applied to the analysis of human behavior and the prediction of emotional and/or mental states by a machine.
Our research interests include the automatic prediction of human characteristics (e.g., personality), the detection of human behavior (posture, gaze, etc.), and the detection of a person's mental and emotional state (e.g., engagement detection, Action Unit detection, and continuous emotion prediction) in the context of human-agent or human-robot interaction. Moreover, we are interested in exploring the use of contextual information to improve machine perception, as well as bias detection and prevention in intelligent machines.
Applications include, but are not limited to, education and healthcare, particularly for people with special needs, such as those affected by autism, Alzheimer's disease, learning disabilities, psychiatric disorders, and dementia.