Can VR revolutionise our understanding of ADHD through multi-sensory information?
Traditional neuropsychological testing for the diagnosis of ADHD is primarily computer-based, requiring the patient to press keys in response to stimuli while seated in a small room with a clinician. This raises the question: to what extent do such tests capture the functioning of ADHD patients in everyday life? Virtual reality (VR) offers a solution to this dilemma, with the potential to revolutionise the way we think about, diagnose and treat ADHD by using multi-sensory information in ways that have not previously been possible.
To this end, Areces et al. (2016) published a study assessing the capacity of virtual reality (VR) to distinguish between four subcategories of ADHD in children aged 5-16: the combined presentation (inattention, hyperactivity and impaired impulse control), the predominantly inattentive presentation, the inattentive/restrictive presentation, and the predominantly impulsive/hyperactive presentation. They used the AULA Nesplora, a VR tool that immerses the patient, via a head-mounted display, in a classroom environment and presents tasks that assess attentional capacity whilst simultaneously introducing realistic “distractions” at random time points. The test comprises three phases:
- Familiarisation: the participant becomes accustomed to the virtual environment by locating and popping balloons (using a mouse).
- The participant must press a button whenever the presented stimulus is *not* the target (e.g. “apple”).
- The participant must press a button whenever they see or hear the target stimulus (e.g. the number “seven”).
Hence, both auditory and visual stimuli are presented and, unlike in traditional tests, distractors can be introduced; together with the classroom setting, both features better reflect realistic scenarios.
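To make the outcome variables concrete, the sketch below scores a small log of trials from a generic continuous-performance test of this kind. The AULA Nesplora's actual data format and scoring are proprietary; the `Trial` structure and variable names here are illustrative assumptions, not the tool's real API.

```python
# Generic sketch of scoring a continuous-performance test (CPT) of the
# kind used in the AULA Nesplora. Trial format and names are assumptions.
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class Trial:
    is_target: bool         # should the participant respond on this trial?
    responded: bool         # did they press the button?
    rt_ms: Optional[float]  # reaction time if they responded, else None
    modality: str           # "visual" or "auditory"

def score(trials):
    """Compute the classic CPT variables discussed in the study."""
    # Missed targets: conventionally linked to inattention
    omissions = sum(t.is_target and not t.responded for t in trials)
    # Responses to non-targets: conventionally linked to impulsivity
    commissions = sum((not t.is_target) and t.responded for t in trials)
    # Reaction times on correct responses index processing speed
    rts = [t.rt_ms for t in trials if t.is_target and t.responded]
    return {
        "omissions": omissions,
        "commissions": commissions,
        "mean_rt_ms": mean(rts) if rts else None,
    }

trials = [
    Trial(True,  True,  520.0, "visual"),
    Trial(True,  False, None,  "auditory"),  # an omission
    Trial(False, True,  310.0, "visual"),    # a commission
    Trial(False, False, None,  "auditory"),
]
print(score(trials))  # {'omissions': 1, 'commissions': 1, 'mean_rt_ms': 520.0}
```

Filtering the trial log by `modality` before scoring would give the separate visual and auditory profiles that the study used to tell the presentations apart.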
Areces et al. (2016) found that, both in the presence and absence of distractions, children with the predominantly impulsive/hyperactive presentation made more commissions and showed higher motor activity levels, whilst those with the predominantly inattentive presentation performed worse on omissions and response times. Interestingly, visual and auditory information in isolation enabled the distinction between the combined, predominantly impulsive/hyperactive and predominantly inattentive presentations. Specifically, the visual data showed that those with the predominantly impulsive/hyperactive presentation made significantly fewer omissions than those with the combined presentation, while the auditory data suggested that the inattentive presentation performed worse than the combined presentation (longer reaction times on the commission variable).
The results of this study demonstrate that, through VR technologies, information from multiple sensory modalities can be used to distinguish between the subgroups of ADHD, enabling treatments to be tailored to the patient more efficiently. Exploring how different sensory modalities contribute to the different subgroups of ADHD will be useful in uncovering the aetiology of the disorder and thus in formulating more effective treatment and rehabilitation regimens to promote healthy neuropsychological functioning in children. As VR technologies develop, the potential to manipulate all senses with high precision and realism will be realised, providing even more powerful means of neuropsychologically assessing patients and uncovering the mechanisms underlying ADHD.