Improving Sit-stand Motor Imagery Paradigm with Virtual Reality Stimulus
Category: Technical Meeting (per paper)
Paper No.: MBE23037
Group: [C] Electronics, Information and Systems Division, Medical and Biological Engineering Technical Meeting
Publication date: 2023/03/15
Title (English): Improving Sit-stand Motor Imagery Paradigm with Virtual Reality Stimulus
Authors (English): Phurin Rangpong (Tokyo Institute of Technology), Akima Connelly (Tokyo Institute of Technology), Pengcheng Li (Tokyo Institute of Technology), Theerawit Wilaprasitporn (Vidyasirimedhi Institute of Science and Technology), Tohru Yagi (Tokyo Institute of Technology)
Keywords: Brain-computer interface | Motor imagery | Electroencephalography | Virtual reality | Head-mounted display
Abstract (English): The decoding of motor imagery (MI) of a sit-stand motion has been applied in brain-computer interfaces and rehabilitation. It has previously been studied with a paradigm combining action observation (AO), MI, and motor execution guided by a visual cue. This study aims to improve the previous paradigm with a self-perspective stimulus delivered through a virtual reality head-mounted display (VR-HMD), an approach that has shown promising results in other MI tasks but has yet to be explored for sit-stand motion. We conducted an experiment comparing the new paradigm using an immersive VR-HMD stimulus against an on-screen video stimulus. Electroencephalography recordings during the AO and MI periods show differences in the alpha and beta bands, and decoding of sit-stand intention also slightly improved in VR.
Pages: 115-117
Manuscript language: English
PDF file size: 1,236 KB
