Computational systems cope better with rational behaviours than with emotional ones, partly because the former are associated with logical reasoning while the latter arise from automatic cognitive processes. Given that emotions play a central role in human experience, this limitation is particularly important for systems meant to be used by humans, such as Virtual Reality (VR) and Augmented Reality (AR) systems.
Recent technological advances in the field of Human–Computer Interaction (HCI) suggest that it is possible to automatically analyse people's mental states and interpret that information as computer commands. However, it is not clear how to automatically analyse the affective states of users, nor how to use that information as commands in VR/AR systems.
Therefore, the main purpose of this research project is to understand how to build VR/AR systems that adapt automatically to the affective states of people. Such systems could support novel tools for mental healthcare, for example video games that help patients regulate their affective states.
This research project is part of a collaboration with the University of Technology of Sydney (UTS).