The critical need for rapid, objective, physiological evaluation of brain function at point-of-care has led to the emergence of brain vital signs, a framework encompassing portable electroencephalography (EEG) and an automated, quick test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all of our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., aging, seniors, dementia). Consequently, it has become important to translate brain vital signs into a visual sensory modality. Therefore, the objectives of this study were to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes across individuals (rho = 0.7, p = 0.0001). Auditory P300 latencies were shorter than visual (p < 0.0001), but normalization and correlation (r = 0.5, p = 0.0033) implied a potential systematic difference across modalities. Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (r = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading and listening language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.

There is an increasing need for objective, neurophysiological measures, such as EEG, to provide unbiased measures of brain function across a range of different points-of-care. In terms of deployable technologies, EEG benefits from being low-cost and non-invasive, and it is particularly well-suited for clinical applications (Connolly et al., 1995; D'Arcy et al., 2003; Gawryluk et al., 2010; Giacino et al., 2014; Sculthorpe-Petley et al., 2015; Ghosh-Hajra et al., 2016a; Fickling et al., 2018). From EEG, a range of markers indexing information processing, from low-level sensory to higher-level cognitive processing, can be extracted as event-related potentials (ERPs) reflecting underlying sensory, attentional, and cognitive processing (D'Arcy et al., 2000; Gawryluk et al., 2010). The translation of EEG/ERP research into neurophysiological assessment applications compatible with the clinical environment has been demonstrated with rapid, non-invasive implementations, such as the Halifax Consciousness Scanner (HCS; D'Arcy et al., 2011) and, more recently, the brain vital signs framework (Ghosh-Hajra et al., 2016a). Typically, ERPs are studied individually using lengthy testing times. The brain vital signs framework, however, combines well-established methods in a rapid, integrated, and fully automated ERP stimulation sequence to elicit three targeted ERP responses.
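The group-level comparison and normalization-and-correlation analysis described above can be illustrated with a minimal sketch. The data below are placeholders (the real per-participant peak amplitudes would come from the ERP extraction pipeline), and the exact statistical procedure used in the study is not specified here; this simply shows one conventional way to pair a within-subject group test with an individual-level correlation of normalized amplitudes.

```python
# Hypothetical sketch: cross-modal comparison of ERP peak amplitudes,
# assuming per-participant peaks have already been extracted from EEG.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 34  # participants, as in the study

# Placeholder amplitudes in microvolts; NOT the study's actual data.
auditory_p300 = rng.normal(5.0, 1.5, n)
visual_p300 = auditory_p300 * 0.8 + rng.normal(0.0, 1.0, n)

def zscore(x):
    # Normalize within modality so amplitudes are comparable across modalities.
    return (x - x.mean()) / x.std(ddof=1)

# Group-level difference: paired test, same participants in both modalities.
t, p_t = stats.ttest_rel(auditory_p300, visual_p300)

# Individual-level association after normalization (rank correlation).
rho, p_rho = stats.spearmanr(zscore(auditory_p300), zscore(visual_p300))

print(f"paired t = {t:.2f} (p = {p_t:.4f}); "
      f"Spearman rho = {rho:.2f} (p = {p_rho:.4f})")
```

A paired design is what makes the normalization step meaningful: z-scoring removes each modality's overall amplitude scale, so a remaining correlation reflects consistent individual differences rather than a shared offset between auditory and visual responses.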