de'Sperati, Claudio; Roatta, Silvestro; Zovetti, Niccolò; Baroni, Tatiana (2021). Decoding overt shifts of attention in depth through pupillary and cortical frequency tagging. Journal of Neural Engineering, 18(3), 036008. ISSN 1741-2560. DOI: 10.1088/1741-2552/ab8e8f

Decoding overt shifts of attention in depth through pupillary and cortical frequency tagging

de'Sperati, Claudio; Roatta, Silvestro; Zovetti, Niccolò; Baroni, Tatiana
2021-01-01

Abstract

Objective. We recently developed a prototype of a novel human-computer interface for assistive communication based on voluntary shifts of attention (gaze) from a far target to a near target, which are accompanied by a decrease in pupil size (Pupillary Accommodative Response, PAR), an automatic vegetative response that can be easily recorded. We report here an extension of that approach based on pupillary and cortical frequency tagging.

Approach. In 18 healthy volunteers, we investigated the possibility of decoding attention shifts in depth by exploiting the evoked oscillatory responses of the pupil (Pupillary Oscillatory Response, POR, recorded through a low-cost device) and of the visual cortex (Steady-State Visual Evoked Potentials, SSVEP, recorded from 4 scalp electrodes). With a simple binary communication protocol (focusing on the far target meaning 'No', focusing on the near target meaning 'Yes'), we aimed to discriminate when the observer's overt attention (gaze) shifted from the far to the near target, the two targets flickering at different frequencies.

Main results. By applying a binary linear classifier (Support Vector Machine, SVM, with leave-one-out cross-validation) to the POR and SSVEP signals, we found that, with only twenty trials and no behavioural training of the subjects, the offline median decoding accuracy was 75% with POR signals and 80% with SSVEP signals; when the two signals were combined, it reached 83%. Decoding accuracy exceeded 70% in 11/18, 12/18 and 14/18 observers with POR, SSVEP and combined features, respectively. A signal detection analysis confirmed these results.

Significance. The present findings suggest that exploiting frequency tagging with pupillary or cortical responses during an attention shift in the depth plane, either separately or combined, is a promising approach to realizing a device for communicating with Complete Locked-In Syndrome (CLIS) patients when oculomotor control is unreliable and traditional assistive communication, even based on PAR, is unsuccessful.
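
The decoding pipeline summarized in the abstract (spectral features at the targets' tagging frequencies, classified by a linear SVM under leave-one-out cross-validation) can be illustrated with a short sketch. The Python code below is an assumption-laden illustration, not the authors' actual analysis: the tagging frequencies, sampling rate, trial length and the synthetic traces are invented for the example, and the paper's real preprocessing and feature set are not detailed in the abstract.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    # Assumed tagging (flicker) frequencies of the far and near targets, in Hz,
    # and an assumed eye-tracker sampling rate; the paper's values may differ.
    F_FAR, F_NEAR = 1.0, 1.5
    FS = 60.0

    def tagging_power(trace, fs, freq):
        """Power of the de-meaned trace at the FFT bin nearest to `freq`."""
        spectrum = np.abs(np.fft.rfft(trace - trace.mean())) ** 2
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
        return spectrum[np.argmin(np.abs(freqs - freq))]

    def trial_features(trace):
        """Feature vector for one trial: power at the two tagging frequencies."""
        return [tagging_power(trace, FS, F_FAR),
                tagging_power(trace, FS, F_NEAR)]

    # Synthetic stand-in data: 20 trials of 10 s each (as many trials as in the
    # study, but random noise instead of real pupil or SSVEP recordings).
    rng = np.random.default_rng(0)
    trials = rng.standard_normal((20, int(10 * FS)))
    y = np.repeat([0, 1], 10)      # 0 = far target ('No'), 1 = near ('Yes')

    X = np.array([trial_features(t) for t in trials])

    # Binary linear classifier with leave-one-out cross-validation.
    clf = SVC(kernel="linear")
    accuracy = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    print(f"LOO-CV decoding accuracy: {accuracy:.2f}")

Combining POR and SSVEP, as the paper reports, would amount to concatenating the two per-trial feature vectors before fitting the same classifier. On the synthetic noise above, accuracy should hover around chance (0.5), which is the expected null result.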
Keywords

Attention shifts in depth
Brain-computer interface
Frequency tagging
Locked-in syndrome
Pupil oscillations
Steady-state visual evoked potentials

Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11768/100669
Citations
  • PMC: ND
  • Scopus: 5
  • Web of Science (ISI): 3