TY - JOUR
T1 - Learning a common dictionary for subject-transfer decoding with resting calibration
AU - Morioka, Hiroshi
AU - Kanemura, Atsunori
AU  - Hirayama, Jun-ichiro
AU - Shikauchi, Manabu
AU - Ogawa, Takeshi
AU - Ikeda, Shigeyuki
AU - Kawanabe, Motoaki
AU - Ishii, Shin
N1 - Publisher Copyright:
© 2015 Elsevier Inc.
PY - 2015/5/1
Y1 - 2015/5/1
N2 - Brain signals measured over a series of experiments have inherent variability because of different physical and mental conditions among multiple subjects and sessions. Such variability complicates the analysis of data from multiple subjects and sessions in a consistent way, and degrades the performance of subject-transfer decoding in a brain-machine interface (BMI). To accommodate the variability in brain signals, we propose 1) a method for extracting spatial bases (or a dictionary) shared by multiple subjects, by employing a signal-processing technique of dictionary learning modified to compensate for variations between subjects and sessions, and 2) an approach to subject-transfer decoding that uses the resting-state activity of a previously unseen target subject as calibration data for compensating for variations, eliminating the need for a standard calibration based on task sessions. Applying our methodology to a dataset of electroencephalography (EEG) recordings during a selective visual-spatial attention task from multiple subjects and sessions, where the variability compensation was essential for reducing the redundancy of the dictionary, we found that the extracted common brain activities were reasonable in the light of neuroscience knowledge. The applicability to subject-transfer decoding was confirmed by improved performance over existing decoding methods. These results suggest that analyzing multisubject brain activities on common bases by the proposed method enables information sharing across subjects with low-burden resting calibration, and is effective for practical use of BMI in variable environments.
KW - Brain-machine interface (BMI)
KW - Dictionary learning and sparse coding
KW - Electroencephalography (EEG)
KW - Multi-subject-session analysis
KW - Spatial attention
KW - Subject-transfer decoding
UR - http://www.scopus.com/inward/record.url?scp=84923279279&partnerID=8YFLogxK
U2 - 10.1016/j.neuroimage.2015.02.015
DO - 10.1016/j.neuroimage.2015.02.015
M3  - Journal article
C2 - 25682943
AN - SCOPUS:84923279279
SN - 1053-8119
VL - 111
SP - 167
EP - 178
JO - NeuroImage
JF - NeuroImage
ER -