
Walking intention recognition method for walking training robot

A walking-training and intention-recognition technology, applied to equipment that helps people walk, physical therapy, and related fields. It addresses the problems of inaccurate intention judgment and poor recognition accuracy, with the effect of improving recognition accuracy and avoiding dangerous situations.

Active Publication Date: 2021-08-24
SHENYANG POLYTECHNIC UNIV
Cites: 9 · Cited by: 1

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is as follows: aiming at the deficiencies of the existing technology, the present invention proposes a dynamic WTR walking intention recognition method that takes walking speed into account, addressing the poor recognition accuracy caused by the user's body swing and the inaccurate judgment of walking intention that results when walking speed is ignored.


Images

Three patent drawings accompany the application, each titled "Walking intention recognition method for walking training robot".


Embodiment Construction

[0073] The present invention will be described in more detail below in conjunction with the accompanying drawings.

[0074] Previous research by our group realized static direction intention (DIT) recognition: based on a distance-type fuzzy reasoning method, the pressure exerted by the user's forearms on the robot was used to judge the direction intention. However, that work did not consider the user's body swing during walking, i.e., the dynamic nature of the walking direction, which significantly degrades the accuracy of direction-intention recognition and leaves users at potential risk of falling during walking. On the other hand, the user's walking speed was also not considered, and walking speed is very important for the accurate implementation of walking training; accounting for it has both theoretical significance and practical value for realizing intelligent robot-assisted training. Therefore, the present invention first analyzes the confidence in...
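The dynamic-recognition idea above can be sketched in code. The following is a minimal illustration, not the patent's actual implementation: per-direction pressure statistics, the Gaussian membership form, the window length, and the 0.6 threshold are all hypothetical, and a single scalar pressure stands in for the real multi-sensor forearm measurements. A static intention is picked per frame from trained mean/std membership functions; a confidence coefficient (the fraction of recent frames agreeing on a candidate direction) then separates brief body swing from a sustained direction-intention transition.

```python
import math
from collections import Counter, deque

# Hypothetical per-direction forearm-pressure statistics (mean, std),
# as would be learned from multi-direction training data.
STATS = {
    "forward": (12.0, 1.5),
    "left":    (6.0, 1.2),
    "right":   (6.5, 1.3),
}

def membership(x, mu, sigma):
    # Gaussian membership function built from the trained mean/std.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def static_intention(pressure):
    # Static direction intention: direction with the largest membership.
    return max(STATS, key=lambda d: membership(pressure, *STATS[d]))

class DynamicRecognizer:
    """Confidence-thresholded dynamic recognition: short disagreements
    (body swing) are ignored; only a sustained change in the per-frame
    static intention is accepted as a genuine transition."""

    def __init__(self, window=5, threshold=0.6):
        self.history = deque(maxlen=window)
        self.threshold = threshold
        self.current = "forward"

    def update(self, pressure):
        self.history.append(static_intention(pressure))
        cand, votes = Counter(self.history).most_common(1)[0]
        confidence = votes / len(self.history)
        if cand != self.current and confidence >= self.threshold:
            self.current = cand  # sustained change: accept the transition
        return self.current
```

With this sketch, a single swing-induced frame leaves the recognized intention unchanged, while several consecutive frames in a new direction flip it, which is the distinction between body swing and an intention transition that the confidence coefficient is meant to capture.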



Abstract

The invention relates to a walking intention recognition method for a walking training robot (WTR), comprising the following steps: acquiring multiple groups of forearm pressure data with a pressure sensor while a user performs multi-direction walking training with the WTR, and calculating the mean value and standard deviation of the pressure data in each direction; fuzzifying the measured data based on the mean and standard deviation, and establishing the membership functions and fuzzy rules of the direction-intention-recognition fuzzy antecedent; performing pre-reasoning and judgment of the static direction intention based on distance-type fuzzy reasoning; introducing a confidence coefficient for the recognized static direction intention and setting a threshold, so as to distinguish the direction-intention transition stage from body swing and achieve dynamic direction-intention recognition; and establishing a linear relation between the speed intention and the walking frequency, fusing it with the information measured by the pressure sensor based on a Gaussian distribution function, and providing an optimal estimate of the subject's speed intention (SIT). The method solves the problems of poor recognition precision caused by the user's body swing and inaccurate judgment of walking intention when walking speed is not considered, both present in existing methods.
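The abstract's final step, fusing a frequency-based speed estimate (from the linear speed-frequency relation) with a pressure-based estimate via Gaussian distribution functions, resembles the standard minimum-variance product-of-Gaussians fusion. The sketch below illustrates that fusion only; the linear coefficients, variances, and measured values are hypothetical, not taken from the patent.

```python
def fuse_gaussian(mu1, var1, mu2, var2):
    """Fuse two Gaussian estimates N(mu1, var1) and N(mu2, var2).

    The product of the two densities yields the minimum-variance
    combined estimate: each mean is weighted by the other's variance,
    and the fused variance is smaller than either input variance.
    """
    var = (var1 * var2) / (var1 + var2)
    mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    return mu, var

# Hypothetical numbers for illustration:
a, b = 0.5, 0.1                    # assumed linear coefficients: v = a*f + b
f = 1.6                            # measured step frequency (Hz)
v_freq, var_freq = a * f + b, 0.02 # frequency-based speed estimate (m/s)
v_press, var_press = 0.85, 0.04    # pressure-based speed estimate (m/s)

v_hat, var_hat = fuse_gaussian(v_freq, var_freq, v_press, var_press)
```

The fused estimate lands between the two inputs, weighted toward the lower-variance one, and its variance is below both inputs' variances, which is why this form serves as an "optimal estimate" of the speed intention given two noisy sources.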

Description

Technical Field

[0001] The invention relates to the technical field of walking training robots (WTR), and in particular to a walking intention recognition method for a WTR.

Background

[0002] With the advent of an aging society, the elderly population is increasing rapidly; in addition, the number of disabled people is growing, and the demand for intelligent rehabilitation equipment in hospitals and homes is rising. Countries all over the world therefore attach great importance to research on rehabilitation medical robots. Because walking is essential to people's quality of life and health, the WTR has become an important class of medical rehabilitation robot. In the daily life of patients with lower-limb dysfunction and in lower-limb rehabilitation, a WTR can greatly reduce the burden on patients and physical therapists, and it is widely used to increase the effect and duration of training. [0003] When the WTR performs passive training tasks...


Application Information

IPC (8): A61H3/04
CPC: A61H3/04; A61H2003/046; A61H2201/5071; A61H2201/5007
Inventors: 王义娜, 郑依伦, 刘中亮, 杨俊友, 白殿春, 孙柏青, 熊文秋
Owner: SHENYANG POLYTECHNIC UNIV