
Brain intention identification method and system based on brain-computer interface

A brain-computer interface and recognition technology, applied in the fields of medical science, sensors, and diagnostic recording/measurement, which addresses problems such as subject brain fatigue, reduced accuracy of brain intention recognition, and low quality of experimental data.

Active Publication Date: 2021-01-22
Owner: YANSHAN UNIV
Cites: 7 | Cited by: 1

AI Technical Summary

Problems solved by technology

In the rehabilitation of the motor nervous system, motor imagery is often used to induce EEG activity. However, when experiments are conducted under pure motor imagery for a long time, subjects are prone to brain fatigue. The experimental data collected in this state are of low quality, which puts great pressure on the extraction of EEG signal features and also reduces the accuracy of brain intention recognition.




Embodiment Construction

[0093] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0094] The object of the present invention is to provide a method and system for recognizing brain intentions based on a brain-computer interface, so as to improve the accuracy of recognizing brain intentions.

[0095] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[...]


Abstract

The invention provides a brain intention identification method based on a brain-computer interface. The method comprises the following steps: constructing an actual task model by utilizing MATLAB; performing an experiment and electroencephalogram information collection based on the actual task model to obtain an electroencephalogram original data set; performing data preprocessing on a plurality of electroencephalogram data in the electroencephalogram original data set to obtain a feature extraction matrix; inputting the electroencephalogram signal feature extraction matrix and a corresponding label into an extreme learning machine to obtain an extreme learning model; and inputting to-be-predicted electroencephalogram data into the extreme learning model to obtain a classification result. The accuracy of identifying the brain intention is improved. The features of the electroencephalogram signals collected by the method are more obvious after common spatial pattern feature extraction, and classification and identification are easier. Besides, the model trained by the extreme learning machine is used as a classifier, so accuracy is high, tedious iterative computation is not needed in the classification process, speed is higher, and the effect is better.
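The abstract names common spatial pattern (CSP) feature extraction followed by an extreme learning machine classifier, but the page does not reproduce the implementation. As a rough illustration of the CSP step only, here is a minimal Python/NumPy sketch; the function names, array shapes, and trace normalization are assumptions for illustration, not details taken from the patent.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=3):
    """Spatial filters from two classes of band-pass-filtered EEG trials.

    trials_a, trials_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2 * n_pairs, n_channels) filter matrix.
    """
    def mean_cov(trials):
        covs = []
        for t in trials:
            c = np.cov(t)                    # channel-by-channel covariance
            covs.append(c / np.trace(c))     # normalize out per-trial amplitude
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized symmetric eigenproblem: ca @ w = lam * (ca + cb) @ w
    eigvals, eigvecs = eigh(ca, ca + cb)
    # Filters at both ends of the (ascending) spectrum are the most discriminative.
    picks = np.concatenate([np.arange(n_pairs), np.arange(-n_pairs, 0)])
    return eigvecs[:, picks].T

def csp_features(trial, filters):
    """Log-variance feature vector for one (n_channels, n_samples) trial."""
    projected = filters @ trial
    var = projected.var(axis=1)
    return np.log(var / var.sum())
```

Stacking csp_features over all trials would give the kind of feature extraction matrix the abstract feeds to the classifier, with more class-separable features than the raw channel signals.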

Description

Technical Field

[0001] The invention relates to the technical field of biological signal processing and machine learning, and in particular to a method and system for recognizing brain intentions based on a brain-computer interface.

Background Technique

[0002] A brain-computer interface analyzes brain intentions by extracting scalp EEG signals and then evaluates brain activity, which is of great significance for solving the medical rehabilitation problems of patients with movement disorders.

[0003] In recent years, experts and scholars at home and abroad have carried out a series of studies on brain-computer interfaces. There are three key steps in research on brain intention recognition: designing reasonable experimental tasks, extracting features of the EEG signals, and classifying the EEG data. Among them, the establishment of a reasonable experimental task model is the first prerequisite for EEG signal extraction. In the study of brain-computer interfaces, experts at home and ...
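The background lists classification of the EEG data as the third key step, and the abstract credits the extreme learning machine with avoiding tedious iterative computation. The sketch below is a generic single-hidden-layer ELM in Python/NumPy, not the patent's own classifier; the class name, hidden-layer size, and tanh activation are assumptions made for illustration.

```python
import numpy as np

class ExtremeLearningMachine:
    """Random fixed hidden layer; output weights solved in closed form."""

    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random projection of the features followed by a nonlinearity.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.standard_normal((n_features, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        targets = np.eye(int(y.max()) + 1)[y]      # one-hot class targets
        H = self._hidden(X)
        # Moore-Penrose pseudo-inverse: no iterative weight updates needed.
        self.beta = np.linalg.pinv(H) @ targets
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

For two-class motor imagery, X would be the CSP log-variance features and y the integer trial labels; the single pseudo-inverse solve is what gives the ELM its speed advantage over iteratively trained networks.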

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): A61B5/372
CPCA61B5/7267
Inventor 付荣荣米瑞甫王世伟
Owner YANSHAN UNIV