
A Brain-Computer Interface-Based Automated Assistance Method for Robotic Arms

A technology combining a brain-computer interface with a robotic arm, applied in the field of brain-computer interface application research. It addresses the problem that existing approaches do not fully exploit the characteristics and advantages of combining brain-computer interfaces with robotic arms, with the effects of improving patients' ability to live independently, reducing the burden on caregivers, and facilitating practical application.

Active Publication Date: 2019-11-15
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] The invention patents mentioned above use EEG signals only to trigger simple, or even preset, robotic-arm movements, and do not fully exploit the combined strengths of brain-computer interface and autonomous robotic-arm control technology. An autonomous robotic-arm assistance system and method based on a brain-computer interface can integrate the advantages of both, making better use of the brain-computer interface to improve the quality of life of paralyzed patients and their ability to live independently.



Examples


Embodiment 1

[0046] As shown in Figure 1, this embodiment provides a brain-computer interface-based robotic arm autonomous assistance system. The system is organized into three layers: a perception layer, a decision-making layer, and an execution layer. The perception layer includes an EEG acquisition and detection module and a visual recognition and positioning module: the EEG acquisition and detection module collects EEG signals and analyzes them to identify the user's intention, while the visual recognition and positioning module identifies and locates the corresponding cup and the position of the user's mouth according to that intention. The execution layer includes a robotic-arm control module, the carrier that performs the actual assistive operations; it plans and controls the trajectory of the robotic arm according to the execution instructions received from the decision-making module. The decision-making layer includes a decision-making...
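The three-layer structure above can be sketched as message passing between modules. This is a minimal illustrative mock-up, not the patent's implementation: the class names, the score-based intention classifier, and the hard-coded positions are all assumptions made for the sketch.

```python
# Hypothetical sketch of the perception / decision / execution layering
# described in Embodiment 1. All module interfaces are illustrative.

class PerceptionLayer:
    """EEG acquisition/detection plus visual recognition/positioning."""

    def detect_intention(self, eeg_scores):
        # Placeholder classifier: pick the cup whose EEG score is largest.
        strongest = max(range(len(eeg_scores)), key=lambda i: eeg_scores[i])
        return f"cup_{strongest}"

    def locate(self, target):
        # Placeholder for the Kinect-based localization; positions are
        # made-up 3-D coordinates in metres.
        positions = {"cup_0": (0.4, -0.2, 0.1), "cup_1": (0.4, 0.0, 0.1),
                     "cup_2": (0.4, 0.2, 0.1), "mouth": (0.1, 0.0, 0.5)}
        return positions[target]


class ExecutionLayer:
    """Robotic-arm control: executes a planned motion at a target pose."""

    def execute(self, command, position):
        return f"{command} at {position}"


class DecisionLayer:
    """Connects perception and execution; turns an intention into commands."""

    def __init__(self, perception, execution):
        self.perception = perception
        self.execution = execution

    def step(self, eeg_scores):
        target = self.perception.detect_intention(eeg_scores)
        cup_pos = self.perception.locate(target)
        mouth_pos = self.perception.locate("mouth")
        # Grasp the chosen cup, then deliver it to the user's mouth.
        return [self.execution.execute("grasp", cup_pos),
                self.execution.execute("deliver", mouth_pos)]


system = DecisionLayer(PerceptionLayer(), ExecutionLayer())
print(system.step([0.1, 0.9, 0.3]))  # strongest score -> cup_1
```

The point of the layering is that the decision-making module is the only component that talks to both sides, matching the patent's description of it as the hub for EEG data, positions, arm state, and execution instructions.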

Embodiment 2

[0052] This embodiment provides a brain-computer interface-based robotic arm autonomous assistance method. As shown in Figure 2, the method includes the following steps:

[0053] 1) The user sits in front of the first computer screen, adjusts their position, puts on the electrode cap for EEG acquisition, turns on the EEG acquisition instrument and the first computer, and confirms that signal acquisition is in good condition;

[0054] 2) Start the brain-computer interface-based robotic arm autonomous assistance system; confirm that the Microsoft Kinect visual sensor that recognizes and locates the user's mouth can correctly capture the mouth, and that the three preset cups to be grasped are correctly placed within the field of view of the Microsoft Kinect visual sensor used to identify and locate the cup to be grasped;
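The text does not disclose how a Kinect detection is turned into a 3-D target for the arm, but the standard approach is pinhole back-projection of a depth pixel. A minimal sketch, assuming typical (not official) Kinect depth-camera intrinsics:

```python
# Illustrative sketch (not from the patent): converting a Kinect depth-image
# pixel (u, v) with measured depth Z into camera-frame 3-D coordinates via
# the pinhole camera model. The intrinsics below are assumed values.

FX, FY = 365.5, 365.5   # focal lengths in pixels (assumed)
CX, CY = 254.9, 205.2   # principal point in pixels (assumed)

def pixel_to_point(u, v, depth_m):
    """Back-project a depth pixel into (X, Y, Z) metres in the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# A detected cup centroid at pixel (300, 220), measured 0.8 m away:
print(pixel_to_point(300, 220, 0.8))
```

A real system would then transform this camera-frame point into the robot's base frame with a calibrated extrinsic matrix before trajectory planning.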

[0055] 3) The first computer screen enters a flickering visual-stimulation function-key interface, which includes four fun...
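The flickering function keys suggest an SSVEP-style stimulus, where each key flickers at its own frequency and the attended key dominates the EEG spectrum. The patent does not disclose its classifier; a minimal sketch under assumed sampling rate and flicker frequencies:

```python
# Hypothetical SSVEP key selection (not the patent's algorithm): pick the
# stimulus frequency with the most power in the EEG window.
import math

FS = 250.0                              # EEG sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]    # one flicker frequency per key (assumed)

def band_power(signal, freq, fs):
    """Squared magnitude of a single-frequency DFT probe (Goertzel-style)."""
    re = sum(x * math.cos(2 * math.pi * freq * i / fs)
             for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs)
             for i, x in enumerate(signal))
    return re * re + im * im

def classify_ssvep(signal, fs=FS):
    """Return the index of the stimulus frequency with the most power."""
    powers = [band_power(signal, f, fs) for f in STIM_FREQS]
    return powers.index(max(powers))

# Synthetic 2-second window: a clean 10 Hz oscillation, as if the user were
# attending the key flickering at 10 Hz.
sig = [math.sin(2 * math.pi * 10.0 * i / FS) for i in range(500)]
print(classify_ssvep(sig))  # -> 1 (the 10 Hz key)
```

Real EEG would need band-pass filtering and a detection threshold so that no command is issued when the user is not attending any key.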



Abstract

The invention discloses a brain-computer interface-based robotic arm autonomous assistance system and method. The system includes a perception layer, a decision-making layer, and an execution layer. The perception layer includes an EEG acquisition and detection module, used to analyze and identify the user's intention, and a visual recognition and positioning module, used to identify and locate the corresponding cup and the position of the user's mouth according to that intention. The execution layer includes a robotic-arm control module, which plans and controls the trajectory of the robotic arm according to the execution instructions received from the decision-making module. The decision-making layer includes a decision-making module, which connects the EEG acquisition and detection module, the visual recognition and positioning module, and the robotic-arm control module, realizing the collection and transmission of EEG signals, positioning data, robotic-arm state data, and the execution instructions sent to the robotic arm. The invention combines visual recognition and positioning technology, a brain-computer interface, and a robotic arm to help paralyzed patients drink water independently and improve their quality of life.

Description

Technical field

[0001] The invention relates to the field of brain-computer interface application research, in particular to a brain-computer interface-based robotic arm autonomous assistance system and method.

Background technique

[0002] There are many severely paralyzed patients in the world who can complete activities necessary for daily life, such as drinking water, only with the help of others. With the continuous development of artificial intelligence and robotics, more and more research results are being applied to assist such people and improve their quality of life. Among these, the brain-computer interface, a rapidly developing branch with broad prospects, has sparked a wave of research.

[0003] A brain-computer interface (BCI) is a new human-computer interaction technology that enables direct communication between the human brain and a computer without going through the brain's conventional output pathways (peripheral nerves and mu...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): B25J9/16; G06F3/01; A61B5/375
CPC: G06F3/015; B25J9/1664; B25J9/1682; G06F2203/011; B25J13/00; G06F3/013; G06F3/0482; G06F3/0304; B25J9/1689; G05B2219/40413; A61B5/375; B25J3/04; B25J13/087; G05B19/4097; G05B2219/35444; G05B2219/39451; H04W80/06
Inventors: 张智军, 黄永前, 李远清
Owner SOUTH CHINA UNIV OF TECH