Neural Research on Visual Fatigue in Virtual Reality

Virtual reality offers immersive depth-perception experiences, but stereoscopic visual fatigue remains a significant barrier to the development of virtual reality applications.


A recent study published in Brain Sciences investigates the fundamental neural mechanism of stereoscopic visual fatigue in virtual reality. A Go/NoGo paradigm based on disparity variations was proposed to induce stereoscopic visual fatigue linked with depth perception. The researchers investigated the impact of disparity variations and stereoscopic visual fatigue on the temporal properties of visual evoked potentials. According to one-way repeated-measures analysis of variance (ANOVA) and point-by-point permutation statistics, disparity variations influence the posterior visual evoked potentials.

Effect of Stereoscopic Visual Fatigue on Virtual Reality Applications

The development of stereoscopic displays, particularly virtual reality, has enabled deeply immersive applications.

The quality of virtual reality experiences is drastically affected by stereoscopic visual fatigue, a state in which the visual system becomes weak, easily exhausted, and unable to sustain viewing. It develops after viewing stereoscopic content under high visual load and can present as diplopia, blurred vision, and other symptoms of binocular abnormality.

Several major visual health issues, including reduced retinal vision, dry eyes, optic neurasthenia, cataracts, and glaucoma, are linked to prolonged electronics-induced visual fatigue.

Evaluating stereoscopic visual fatigue is crucial to advance stereoscopic display technology for more pleasant viewing. Vergence-accommodation conflict (VAC) and high binocular disparity contribute to stereoscopic visual fatigue.

The Efficiency of Visual Evoked Potential

Visual evoked potentials are among the most valuable and dynamic methods for tracking the neural information stream in real time. They are typically extracted from scalp-recorded electroencephalography (EEG) through signal averaging and are time-locked to specific visual sensory and perceptual events.
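Signal averaging works because the stimulus-locked neural response repeats with the same shape and latency on every trial, while background EEG is uncorrelated with the stimulus and averages toward zero. A minimal sketch of this extraction on synthetic data (the sampling rate, epoch window, and waveform shape are illustrative choices, not values from the study):

```python
import numpy as np

def extract_vep(eeg, event_samples, fs=1000, pre=0.1, post=0.4):
    """Average stimulus-locked EEG epochs to estimate a visual evoked potential.

    eeg: 1-D array of samples from one scalp electrode.
    event_samples: sample indices at which the visual stimulus appeared.
    Returns the mean epoch spanning [-pre, +post] seconds around each event.
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for s in event_samples:
        if s - n_pre < 0 or s + n_post > len(eeg):
            continue  # skip events too close to the recording edges
        epoch = eeg[s - n_pre : s + n_post].astype(float)
        epoch -= epoch[:n_pre].mean()  # baseline-correct with the pre-stimulus mean
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# Synthetic demo: a fixed 10 µV bump buried in much larger noise at each event.
rng = np.random.default_rng(0)
fs = 1000
eeg = rng.normal(0, 20, 60 * fs)                    # 60 s of noisy "EEG"
events = np.arange(1000, 59000, 600)                # stimulus onset samples
t = np.arange(int(0.4 * fs)) / fs
response = 10 * np.exp(-((t - 0.1) ** 2) / 0.001)   # P100-like component
for s in events:
    eeg[s : s + len(response)] += response

vep = extract_vep(eeg, events, fs=fs)               # bump re-emerges near +100 ms
```

Averaging roughly a hundred epochs shrinks the noise by about a factor of ten, which is why a component far smaller than the raw EEG becomes visible in the mean waveform.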

Visual evoked potentials can serve as an objective measure of the response to visual stimuli. They have therefore emerged as one of the most efficient techniques for studying human visual cognitive function in recent years.

Variations in the properties of typical visual evoked potential components indicate functional changes in specific neural regions, so these properties can be used as indicators of particular stages of visual information processing. Variations in disparity are known to trigger visual evoked potentials.

Investigation of the Neural Mechanisms of Stereoscopic Visual Fatigue

Guo et al. developed a Go/NoGo-based experimental paradigm, presented through a head-mounted display, to examine the neural mechanism of stereoscopic visual fatigue. Disparity variations combined with vergence–accommodation conflict induced the stereoscopic visual fatigue. The Go/NoGo paradigm, typically used to assess participants' capacity for sustained attention and response control, was chosen to keep respondents' attention levels high throughout the stereoscopic visual fatigue experiment.

Random dot stereograms (RDSs) were employed as visual stimuli to isolate the disparity variations, enabling the analysis of the time-domain characteristics of visual evoked potentials. The association between disparities and the characteristics of visual evoked potential components, as well as the relationship between stereoscopic visual fatigue and those same characteristics, was investigated using point-by-point statistics and one-way repeated-measures analysis of variance (ANOVA).
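An RDS isolates disparity because it contains no monocular depth cues: depth is defined purely by the horizontal offset of a dot region between the two eyes' images. A minimal sketch of how such a stimulus pair can be generated (image size, region size, dot density, and disparity in pixels are illustrative, not the study's values):

```python
import numpy as np

def make_rds(size=200, region=80, disparity=6, density=0.5, seed=0):
    """Build a random dot stereogram pair (left-eye and right-eye images).

    A central square region is shifted horizontally by `disparity` pixels in
    the right-eye image, so when the pair is fused it appears at a different
    depth; everywhere else the two images carry identical random dots.
    """
    rng = np.random.default_rng(seed)
    left = (rng.random((size, size)) < density).astype(np.uint8)
    right = left.copy()
    r0 = (size - region) // 2
    patch = left[r0 : r0 + region, r0 : r0 + region]
    # place the central region `disparity` pixels to the right in the right eye
    right[r0 : r0 + region, r0 + disparity : r0 + region + disparity] = patch
    # refill the strip uncovered by the shift with fresh random dots
    right[r0 : r0 + region, r0 : r0 + disparity] = (
        rng.random((region, disparity)) < density
    )
    return left, right

left, right = make_rds()
```

Varying only the `disparity` argument between trials changes the perceived depth of the central square while leaving every monocular property of the images statistically identical, which is what allows the evoked response to be attributed to disparity alone.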

Participants and Research Environment

Fourteen healthy right-handed adults (six males and eight females, aged 24 ± 1.1 years) were recruited from graduate students at the Beijing Institute of Technology (Beijing, China). Participants had normal or corrected-to-normal vision, normal stereoscopic vision, and no degenerative, psychiatric, or neurological illnesses known to influence cognition. Each participant signed an informed consent form after receiving complete information about the study.

Participants were seated in a quiet, air-conditioned room on a comfortable, height-adjustable chair. The display unit was an HTC Vive Pro head-mounted display (HMD) with twin 3.5-inch AMOLED displays (one per eye), each with a resolution of 1440 × 1600 pixels, a horizontal field of view (FOV) of 110°, and a refresh rate of 90 Hz. During the experiment, participants wore a 64-electrode Compumedics NeuroScan EEG cap. To minimize interference, the HMD was mounted on a movable mechanical arm rather than worn on the participants' heads during EEG collection.

Research Findings

In this study, Guo et al. developed a Go/NoGo paradigm for maintaining focus and created the first experimental stereoscopic visual fatigue scenario induced by disparity variations in a virtual reality environment. The temporal features of visual evoked potentials elicited by disparity shifts were measured using RDSs as visual stimuli. Point-by-point statistics and one-way repeated-measures ANOVA were used to investigate the associations of disparities and stereoscopic visual fatigue with the properties of the visual evoked potential components.
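A one-way repeated-measures ANOVA partitions the total variance into a between-subject part, a between-condition part, and a residual, so each participant serves as their own control across disparity conditions. A sketch of the computation on a made-up toy dataset (not the study's data):

```python
import numpy as np
from scipy import stats

def rm_anova_1way(data):
    """One-way repeated-measures ANOVA.

    data: (n_subjects, k_conditions) array, one row per participant.
    Returns (F, p) for the within-subject condition effect.
    """
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # condition effect
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # subject offsets
    ss_total = ((data - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj                    # residual
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(F, df_cond, df_err)
    return F, p

# Toy data: 3 "participants", each measured under 2 disparity conditions.
data = np.array([[1.0, 3.0],
                 [2.0, 4.0],
                 [5.0, 6.0]])
F, p = rm_anova_1way(data)
```

Because subject-to-subject offsets are removed before the condition effect is tested, a consistent shift between conditions reaches significance even when baseline amplitudes differ widely across participants, which is the usual motivation for a repeated-measures design in EEG work.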

The researchers found that disparity variations elicited posterior visual evoked potential components. This indicated that posterior visual evoked potentials originating from the precuneus may be related to depth perception, reflecting the neural response to vergence during depth perception.
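The point-by-point statistics mentioned above compare the evoked waveforms sample by sample rather than at a single latency. One common approach is a paired sign-flip permutation test on per-subject difference waveforms; the sketch below applies it to synthetic data and is an illustration of the general technique, not the study's exact procedure:

```python
import numpy as np

def pointwise_permutation_test(cond_a, cond_b, n_perm=2000, seed=0):
    """Paired sign-flip permutation test at every time point.

    cond_a, cond_b: (n_subjects, n_timepoints) arrays of per-subject waveforms.
    At each time point, the sign of each subject's A-minus-B difference is
    randomly flipped, and we count how often the shuffled mean is at least as
    extreme as the observed mean. Returns one p-value per time point.
    """
    rng = np.random.default_rng(seed)
    diff = cond_a - cond_b                     # paired differences
    observed = np.abs(diff.mean(axis=0))
    exceed = np.zeros(diff.shape[1])
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(diff.shape[0], 1))
        exceed += np.abs((signs * diff).mean(axis=0)) >= observed
    return (exceed + 1) / (n_perm + 1)

# Synthetic check: a real difference in the first 50 samples, none afterwards.
rng = np.random.default_rng(1)
a = rng.normal(0, 1, (12, 100))
a[:, :50] += 5.0                               # condition effect, early samples
b = rng.normal(0, 1, (12, 100))
p = pointwise_permutation_test(a, b)           # small p early, large p later
```

Testing every sample this way maps out the latency range over which two disparity conditions diverge, rather than forcing a choice of component window in advance; in practice the per-sample p-values would still need a multiple-comparisons correction.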

The main finding of this study is that posterior visual evoked potentials could serve as a valuable indicator for assessing stereoscopic visual fatigue, as well as an indicator of the disparity variation that separates comfortable from uncomfortable virtual reality content.

Reference

Guo, M., Yue, K., Hu, H., Lu, K., Han, Y., Chen, S., & Liu, Y. (2022) Neural Research on Depth Perception and Stereoscopic Visual Fatigue in Virtual Reality. Brain Sciences, 12(9), 1231. https://www.mdpi.com/2076-3425/12/9/1231



Written by

Usman Ahmed

Usman holds a master's degree in Material Science and Engineering from Xian Jiaotong University, China. He worked on various research projects involving Aerospace Materials, Nanocomposite coatings, Solar Cells, and Nano-technology during his studies. He has been working as a freelance Material Engineering consultant since graduating. He has also published high-quality research papers in international journals with a high impact factor. He enjoys reading books, watching movies, and playing football in his spare time.

Citations

Please use the following format to cite this article in your essay, paper or report:

Ahmed, Usman. (2022, September 13). Neural Research on Visual Fatigue in Virtual Reality. AZoOptics. Retrieved on April 26, 2024 from https://www.azooptics.com/News.aspx?newsID=27901.

