The Neural Signature of Memory
The goal: Use brain activity to predict memory in VR.
People played an adaptively difficult memory game in VR and on a flat screen while wearing a Muse EEG headset to capture brain activity, which we fed into a machine learning classifier. We looked at brain activity while the objects were flashing (called the Encoding phase, because that's when you encode stuff into memory).
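If you're curious what that kind of pipeline can look like, here's a minimal sketch (not our actual analysis code) of turning encoding-phase EEG epochs into features for a classifier. The channel count matches the Muse headset's four electrodes, but the sampling rate, frequency bands, and array shapes are illustrative assumptions.

```python
# Hypothetical feature extraction for a subsequent-memory classifier.
# Assumptions: 4 Muse channels (TP9, AF7, AF8, TP10), 256 Hz sampling,
# epochs already cut around each object presentation.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epochs: np.ndarray) -> np.ndarray:
    """epochs: (n_trials, n_channels, n_samples) encoding-phase EEG.
    Returns (n_trials, n_channels * n_bands) log band-power features."""
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Average spectral power within the band, per trial and channel
        feats.append(psd[..., mask].mean(axis=-1))
    return np.log(np.stack(feats, axis=-1)).reshape(len(epochs), -1)
```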
Below is the experimental design: participants performed the memory task in VR with a controller and on a PC with a mouse (counterbalanced, of course).
Below are the EEG results, averaged across all participants.
Red lines = brain activity for stuff you later forgot.
Blue lines = brain activity for stuff you later remembered.
Notice that the red and blue lines are clearly different in the VR condition, but not so different in the Non-VR condition. Sure enough, we were able to predict future memory from brain activity during encoding, but this effect was much stronger in VR. In other words, brain activity carries more predictive signal in VR than when looking at a flat screen, even when playing the same game.
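To make that comparison concrete, here's a rough sketch of the kind of analysis described above: train a classifier on encoding-phase features to predict later remembered vs. forgotten, and compare cross-validated performance between conditions. The variable names (X_vr, y_vr, X_flat, y_flat) and the choice of logistic regression are assumptions for illustration, not necessarily what we used in the paper.

```python
# Compare remembered-vs-forgotten decoding between conditions (sketch).
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def decoding_auc(X, y):
    """Cross-validated ROC AUC for predicting remembered (1) vs. forgotten (0)."""
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    return cross_val_score(clf, X, y, cv=cv, scoring="roc_auc").mean()

# auc_vr = decoding_auc(X_vr, y_vr)        # features/labels from the VR condition
# auc_flat = decoding_auc(X_flat, y_flat)  # features/labels from the flat-screen condition
# A higher auc_vr than auc_flat would reflect the stronger predictive signal in VR.
```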
Lessons Learned: Your brain simply responds differently to 3D objects in VR than to flat representations of the same objects. I believe this speaks to the power of presence in VR.
To learn more, check out the paper we published.