
Multi-modal brain-computer interface (BCI) method and system based on synchronic compound limb imaginary movement

A brain-computer interface and synchronization technology, applied to computer components, mechanical pattern conversion, and user/computer interaction input/output. It addresses problems such as the small number of selectable task categories, limited flexibility, and the inability to support multi-command output, and achieves the effect of supporting a large instruction-set output.

Publication date: 2017-03-15 (Inactive)
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, simple limb motor imagery mainly involves movements of the left hand, right hand, and feet. With so few selectable categories, it cannot support multi-command output, which severely limits the flexibility of external control in motor-imagery-based brain-computer interface systems.



Examples


Embodiment 1

[0039] A multi-modal brain-computer interface method based on synchronous compound limb motor imagery, see Figure 1; the method includes the following steps:

[0040] 101: Design seven types of synchronous compound limb motor-imagery tasks involving multi-limb participation of the hands, feet, and trunk;

[0041] 102: Present a four-stage task paradigm to the subjects, collect EEG data, and preprocess it;

[0042] 103: Apply three different common spatial pattern (CSP) algorithms to the preprocessed EEG data for feature extraction and pattern recognition, obtaining a single-task EEG feature vector;

[0043] 104: Input the single-task EEG feature vector into a support vector machine to train the classifier, and then classify the spatial features of the test set (an illustrative sketch of steps 102-104 follows at the end of this embodiment).

[0044] In step 101, the first type of synchronous compound limb motor-imagery task is coordinated movement of both hands; the second type is left hand a...
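
For illustration only, the following is a minimal sketch of how steps 102-104 could be organized once epoched EEG data are available. The epoch shape, label coding, and the use of MNE-Python's multi-class CSP together with scikit-learn's SVC are assumptions; the MNE CSP here merely stands in for the patent's own multi-class CSP variants (Multi-CSP, Multi-sTRCSP, Multi-GECSP) and is not the patented algorithm.

    # Illustrative sketch of steps 102-104 (assumed epoch shape and labels;
    # MNE's CSP stands in for the patent's multi-class CSP variants).
    import numpy as np
    from mne.decoding import CSP                     # common spatial pattern transformer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Placeholder epoched EEG: 140 trials x 64 channels x 500 samples, 7 classes.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((140, 64, 500))
    y = np.repeat(np.arange(7), 20)

    # Step 103: CSP spatial filtering with log-variance features.
    # Step 104: support vector machine classifier trained on those features.
    clf = make_pipeline(CSP(n_components=6, log=True), SVC(kernel="rbf", C=1.0))

    # Cross-validated seven-class accuracy, used here in place of a fixed
    # train/test split purely to keep the sketch short.
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"mean accuracy: {scores.mean():.3f}")
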

Embodiment 2

[0065] The scheme of Embodiment 1 is described in detail below with reference to the specific drawings and calculation formulas:

[0066] 201: Design three types of synchronous compound limb motor-imagery tasks involving multi-limb participation of the hands, feet, and trunk;

[0067] The embodiment of the present invention designs three types of synchronous compound limb motor-imagery tasks involving multi-limb participation of the hands, feet, and trunk: the first type is coordinated movement of both hands; the second type is coordinated movement of the left hand and the contralateral lower limb; and the third type is coordinated movement of the right hand and the contralateral lower limb. These are complemented by three simple limb motor-imagery tasks and a rest condition: the fourth type is the left hand; the fifth type is the right hand; the sixth type is foot movement; and the seventh type is the resting state.
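
Purely as an organizational aid, the seven classes in [0067] could be coded as follows; the integer codes are illustrative assumptions, not part of the patent.

    # Hypothetical label coding for the seven task classes described in [0067].
    TASK_LABELS = {
        0: "both hands (compound)",
        1: "left hand + contralateral lower limb (compound)",
        2: "right hand + contralateral lower limb (compound)",
        3: "left hand (simple)",
        4: "right hand (simple)",
        5: "foot (simple)",
        6: "resting state",
    }
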

[0068] 202: Experimental Paradigm;

[0069] During the experiment, the subjects sat quietly on a c...

Embodiment 3

[0116] The feasibility of the schemes in Embodiments 1 and 2 is verified below with concrete test data:

[0117] Table 1 shows the classification accuracy of ten subjects on the seven-class task under the three multi-class CSP algorithms. The second subject achieved the highest seven-class accuracy, exceeding 80% under all three CSP algorithms and reaching 84.11% under Multi-sTRCSP. The third subject performed the worst, with an accuracy of about 63%. Averaged over all subjects, the classification accuracies of Multi-sTRCSP and Multi-CSP are both about 70%, with Multi-sTRCSP slightly better than Multi-CSP, while Multi-GECSP performs the worst.

[0118] The above results show that the multi-class CSP algorithms, extended from the binary CSP algorithm, can be applied to the fea...



Abstract

The invention discloses a multi-modal brain-computer interface (BCI) method and system based on synchronous compound limb motor imagery. The method comprises the following steps: designing seven types of synchronous compound limb motor-imagery tasks involving multi-limb participation of the hands, feet, and trunk; presenting a four-stage task paradigm to a subject, collecting EEG data, and preprocessing it; performing feature extraction and pattern recognition on the preprocessed EEG data with three different common spatial pattern (CSP) algorithms to obtain a single-task EEG feature vector; and inputting the single-task EEG feature vector into a support vector machine to train a classifier, which then classifies the spatial features of the test set. The invention establishes a novel multi-modal BCI paradigm based on synchronous compound limb motor imagery, overcoming the limitations that existing motor-imagery paradigms offer few task categories and do not correspond to actual movements. It realizes large instruction-set output for information controlled by a motor imagery (MI)-BCI system, and provides a new approach and method for promoting the practical application of brain-computer interfaces in rehabilitation engineering.

Description

Technical Field

[0001] The present invention relates to the field of brain-computer interfaces, and in particular to a multi-modal brain-computer interface method and system based on synchronous compound limb motor imagery.

Background

[0002] Motor imagery (MI) is one of the classic paradigms of the brain-computer interface (BCI), but it offers few selectable task categories and cannot satisfy large instruction-set output. Motor imagery means that there is motor intention but no actual motor output. As the only brain-computer interface paradigm that requires no external stimulus and directly reflects the user's subjective movement intention, motor imagery best conforms to the brain's normal thinking activity and is not prone to causing user fatigue. A brain-computer interface system based on motor imagery can truly "turn thoughts into actions", which is the best way to build patients' confidence in returning to normal life...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/015; G06F2203/011
Inventor: 明东奕伟波邱爽綦宏志何峰
Owner: TIANJIN UNIV