A body of research in Human-Computer Interaction (HCI) suggests that it is possible to infer characteristics of users’ mental states by analyzing their electrophysiological responses in real time. However, it remains unclear how the information extracted from electrophysiological signals can be used to adjust the stimuli in a virtual environment according to the user’s affective state. The main objective of this research project is therefore to understand how to develop a virtual reality (VR) system that adapts automatically to the user’s affective state. A reference implementation of a neurofeedback VR experience for training affective self-regulation is proposed, aiming to train users’ ability to regulate their affective states voluntarily. The main contributions are (1) a technique for near-real-time detection of affective states in VR users; (2) a virtual environment for the visual representation of affective states; and (3) the combination of the affect detection technique and the virtual environment into a neurofeedback VR experience.