Making experiments feel real.
Our work in the Integrated Cognition and Emotion lab aims to understand how people function in their natural environment. We strive to describe human emotions as they are experienced, and as they influence us, in everyday life rather than only in confined laboratory settings. Asking scientific questions and establishing causal relationships in naturally occurring experience and behaviour, however, remains a challenge.
To address this challenge, the Integrated Cognition and Emotion lab uses high-fidelity virtual environments to control aspects of the environment that could influence emotional responses, while still affording naturalistic viewing and experience. To date, our focus has been primarily on the development and deployment of high-fidelity auditory virtual environments for use during MR scanning, which allow realistic auditory spatial perception while the brain is being imaged. As we move forward, we aim to extend the reach of this work and to broaden our expertise to other sensory domains. We are currently validating semi-personalized auditory virtual environments against real-world sound source localization, and developing a set of standardized visual emotion stimuli for presentation in immersive virtual reality (VR) systems.
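For readers curious about the underlying technique: auditory virtual environments of this kind are typically built by convolving a mono sound source with a head-related impulse response (HRIR) pair for a given direction, producing a binaural signal that the listener perceives as spatially located. The sketch below is purely illustrative, not our lab's actual pipeline; it assumes a standard NumPy/SciPy environment, and the HRIR arrays are hypothetical placeholders standing in for measured or semi-personalized filters.

```python
# Illustrative sketch only: binaural rendering by HRIR convolution.
# The HRIR arrays here are dummy placeholders, not measured data.
import numpy as np
from scipy.signal import fftconvolve

def spatialize(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """Render a mono signal to binaural stereo by convolving it with the
    left- and right-ear head-related impulse responses (HRIRs) for one
    source direction."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    stereo = np.stack([left, right], axis=-1)
    peak = np.abs(stereo).max()
    return stereo / peak if peak > 0 else stereo  # normalize to avoid clipping

# Toy usage: a short noise burst "placed" with dummy 256-tap HRIRs.
rng = np.random.default_rng(0)
burst = rng.standard_normal(4800)                                 # ~0.1 s at 48 kHz
hrir_l = rng.standard_normal(256) * np.exp(-np.arange(256) / 32.0)
hrir_r = rng.standard_normal(256) * np.exp(-np.arange(256) / 32.0)
binaural = spatialize(burst, hrir_l, hrir_r)                      # shape: (5055, 2)
```

In practice, semi-personalization amounts to choosing or adapting the HRIR set to the individual listener, which is why validating perceived locations against real-world sound source localization matters.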
As we establish the lab at LU, we aim to expand our VR expertise, designing and building our own multimodal virtual environments for use in our research programs and beyond.