Type: Journal article
Title: Visual appearance modulates prediction error in virtual reality
Author: Singh, A.
Chen, H.
Cheng, Y.
King, J.
Ko, L.
Gramann, K.
Lin, C.
Citation: IEEE Access, 2018; 6:24617-24624
Publisher: IEEE
Issue Date: 2018
ISSN: 2169-3536
Statement of Responsibility: Avinash Kumar Singh, Hsiang-Ting Chen, Yu-Feng Cheng, Jung-Tai King, Li-Wei Ko, Klaus Gramann, and Chin-Teng Lin
Abstract: Different rendering styles induce different levels of agency and user behaviors in virtual reality environments. We applied an electroencephalogram-based approach to investigate how the rendering style of the users' hands affects behavioral and cognitive responses. To this end, we introduced prediction errors due to cognitive conflicts during a 3-D object selection task by manipulating the selection distance of the target object. The results showed that, for participants with high behavioral inhibition scores, the amplitude of the negative event-related potential at approximately 50-250 ms correlated with the realism of the virtual hands. Concurring with the uncanny valley theory, these findings suggest that the more realistic the representation of the user's hand is, the more sensitive the user becomes toward subtle errors, such as tracking inaccuracies.
Keywords: Virtual reality; cognitive conflict; prediction error; virtual hand illusion; EEG; body ownership
Rights: © 2018 IEEE. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission.
DOI: 10.1109/ACCESS.2018.2832089
Appears in Collections: Aurora harvest 8
Computer Science publications

Files in This Item:
There are no files associated with this item.
