
Stable vision-induced brain-computer interface-based robot control method

A steady-state visual evoked potential brain-computer interface technology, applied in the field of robotics

Active Publication Date: 2014-06-11
BEIJING UNIV OF TECH
View PDF · 10 Cites · 40 Cited by

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems in the prior art that brain-computer interface devices are complex, require extensive training, and cannot extract features effectively, the present invention provides a robot control method based on a steady-state visual evoked potential brain-computer interface. Steady-state visual evoked potentials elicited by flickering stimuli avoid the complexity of brain-computer interface hardware, and Independent Component Analysis (ICA) is combined with the Hilbert-Huang Transform (HHT) during feature extraction, so that the features of the EEG signal can be extracted effectively.
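The ICA stage named above separates the multi-channel EEG recording into statistically independent components before the HHT is applied. As an illustrative sketch (not the patent's implementation), the fixed-point FastICA iteration with a tanh nonlinearity can be written with NumPy alone; the synthetic 10 Hz and 15 Hz sources stand in for flicker responses:

```python
import numpy as np

def fast_ica(X, n_iter=200, tol=1e-6):
    """Symmetric FastICA with a tanh nonlinearity.
    X: (channels, samples). Returns the unmixed component signals."""
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening: decorrelate channels and scale each to unit variance
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n = Z.shape[0]
    W = np.linalg.qr(np.random.default_rng(0).normal(size=(n, n)))[0]
    for _ in range(n_iter):
        WZ = W @ Z
        g, g_prime = np.tanh(WZ), 1.0 - np.tanh(WZ) ** 2
        W_new = (g @ Z.T) / Z.shape[1] - np.diag(g_prime.mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)      # symmetric decorrelation
        W_new = U @ Vt
        if np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1)) < tol:
            W = W_new
            break
        W = W_new
    return W @ Z

# Demo: unmix two synthetic flicker responses from an unknown 2x2 mixture
t = np.arange(0, 4, 1 / 256)                  # 4 s at 256 Hz
S = np.vstack([np.sin(2 * np.pi * 10 * t),    # 10 Hz sinusoid
               np.sign(np.sin(2 * np.pi * 15 * t))])  # 15 Hz square wave
A = np.array([[0.8, 0.3], [0.4, 0.7]])        # mixing matrix (unknown in practice)
recovered = fast_ica(A @ S)
```

Each recovered row matches one original source up to sign and scale, which is all that is needed before the per-component HHT analysis.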

Method used




Embodiment Construction

[0072] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0073] The method flow chart of this embodiment is shown in figure 1; it specifically includes the following steps:

[0074] 1. Subjects were required to have normal or corrected-to-normal vision. Each subject was seated in a comfortable chair approximately 65 cm from the computer monitor. Electrodes were placed according to the international 10-20 electrode placement standard: EEG electrodes at P3, PZ, P4, PO3, POZ, PO4, O1, OZ, and O2 over the occipital area of the subject's head, with the ear used as the reference electrode and the ground electrode grounded. Conductive paste was injected into each electrode to keep the electrode impedance below 5 kΩ, so that the recordings are more accurate.
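The setup step above fixes a nine-channel montage and a 5 kΩ impedance target. A minimal sketch of how an acquisition script might validate measured impedances against that target (the channel list comes from the protocol; the helper and variable names are illustrative):

```python
# Nine occipital/parietal channels named in the protocol
CHANNELS = ["P3", "PZ", "P4", "PO3", "POZ", "PO4", "O1", "OZ", "O2"]
MAX_IMPEDANCE_KOHM = 5.0    # target stated in the protocol

def bad_electrodes(impedances_kohm):
    """Return the channels whose measured impedance exceeds the 5 kOhm
    limit; a missing reading counts as failing the check."""
    return [ch for ch in CHANNELS
            if impedances_kohm.get(ch, float("inf")) > MAX_IMPEDANCE_KOHM]

# Example: OZ needs more conductive paste
readings = {ch: 3.2 for ch in CHANNELS}
readings["OZ"] = 8.7
print(bad_electrodes(readings))   # prints ['OZ']
```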

[0075] 2. Presentation of stimulus paradigm. Use the Psyc...



Abstract

The invention relates to a stable vision-induced brain-computer interface-based robot control method. The method comprises the steps of: first preprocessing the actually collected electroencephalogram (EEG) signal with bandpass filtering; second, performing fast independent component analysis on the preprocessed signal to obtain independent components; then applying the Hilbert-Huang transform to decompose the independent components into intrinsic mode functions; performing spectrum analysis on the intrinsic mode functions to obtain the required features; and finally classifying the extracted features with a threshold judgment method and translating the classification result into signals the robot can recognize, thus realizing real-time control of the robot. The method is based on a steady-state visually evoked brain-computer interface, offers a high transmission rate, and uses simple equipment. Because independent component analysis and the Hilbert-Huang transform are combined in the feature extraction process, feature extraction is more effective. Control of the robot's motion without limb movement is realized, so that severely paralyzed persons with normal brain function can control a robot to assist them in daily living.
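The abstract describes a pipeline of bandpass filtering, spectral feature extraction, and threshold classification mapped to robot commands. A simplified end-to-end sketch, assuming illustrative stimulus frequencies, command labels, and threshold (the patent's actual parameters are not given here), using an FFT spectrum in place of the full ICA + HHT chain:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                      # sampling rate in Hz (illustrative)
STIM_FREQS = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "stop"}

def bandpass(x, lo=4.0, hi=35.0, fs=FS):
    """Preprocessing step: 4th-order Butterworth bandpass, zero-phase."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def classify(x, thresh=2.0):
    """Return the robot command whose stimulus frequency dominates the
    spectrum, or None if no candidate peak clears the threshold."""
    x = bandpass(x)
    spec = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / FS)
    baseline = spec.mean()
    best, best_amp = None, 0.0
    for f, cmd in STIM_FREQS.items():
        amp = spec[np.argmin(np.abs(freqs - f))]
        if amp > thresh * baseline and amp > best_amp:
            best, best_amp = cmd, amp
    return best

# Demo: a 2-second epoch dominated by the 10 Hz flicker response
t = np.arange(0, 2, 1 / FS)
epoch = np.sin(2 * np.pi * 10 * t) \
        + 0.3 * np.random.default_rng(1).normal(size=t.size)
print(classify(epoch))        # prints: right
```

In the patent's full method, the spectrum would be taken from the intrinsic mode functions of each independent component rather than from the raw filtered channel; the thresholding and command translation stages are the same in spirit.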

Description

technical field

[0001] The invention relates to the field of robots, in particular to a robot control method based on a steady-state visually evoked brain-computer interface.

Background technique

[0002] A Brain-Computer Interface (BCI) is a communication and control system that does not depend on the brain's normal output channels of peripheral nerves and muscles. It establishes a direct communication channel between the brain and the outside world by collecting and analyzing the bioelectrical signals of the human brain, so that people can express their wishes or operate electronic equipment such as computers, speech synthesizers, assistive appliances, neural prostheses, and robots through the brain. The ultimate goal of brain-computer interface research is to design and implement a new type of EEG-based assistive device to help disabled patients with movement disorders communicate better with the outside world. Therefore, research i...

Claims


Application Information

Patent Timeline: no application
IPC(8): A61F4/00
Inventor: 阮晓钢, 薛坤, 黄静
Owner BEIJING UNIV OF TECH