Human body posture visual identification method of transfer carrying nursing robot

A human body posture visual recognition technology, applied in the field of human body posture visual recognition for nursing robots. It addresses problems such as unrecognized or misrecognized postures, cumbersome contact measurement, and heavy algorithmic computation, with the effects of reducing dependence on joint positions, adapting to the home environment, and protecting human-machine safety.

Inactive Publication Date: 2019-08-06
HEBEI UNIV OF TECH
8 Cites · 5 Cited by

AI Technical Summary

Problems solved by technology

[0005] At present, human body posture recognition is mostly contact-based: the posture is estimated from markers attached to the body. For example, an inertial tracker can detect the rotation and extension of a patient's forearm and wrist, and Chinese patent 201611272616.X (Dalian University of Technology) uses 12 data acquisition nodes placed at different positions on the human body to measure its movement posture. Contact measurement requires the patient to wear a variety of sensors or optical markers, which is not only cumbersome but also hinders the patient's movement and causes psychological discomfort, and is therefore not conducive to practical, fully automatic detection.
Non-contact human body posture measurement is mainly vision-based. The color-image pose recognition method PAF (Part Affinity Fields; Cao Z, Simon T, Wei S E, et al. Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields, 2016) achieves high-precision, high-reliability human pose recognition, but it provides only 2D pixel coordinates, and the estimation ...



Examples


Embodiment

[0055] Define the human body posture as shown in Figure 2, with 15 joint points in total (the number of joint points can be set according to requirements). The technical solution for close-range visual recognition of human posture is to make full use of the RGBD information: a first-stage neural network estimates the pixel coordinates of the human joints in the color image, giving the joint recognition algorithm its adaptability to close-range postures; then a second-stage neural network, taking the depth map and the joint heat maps as input, upgrades the dimension and refines the accuracy of the first stage's 2D joint coordinate estimates, yielding the 3D human pose. The flow of the joint recognition algorithm of the present invention is shown in Figure 3.
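The two-stage idea above (2D joint pixels from the stage-1 heat maps, then lifting to 3D with the depth map) can be sketched as follows. This is a minimal illustration, not the patented networks themselves: peak-picking over heat maps stands in for the first-stage output, and the back-projection assumes a pinhole camera with hypothetical intrinsics `fx, fy, cx, cy`.

```python
import numpy as np

def joints_2d_from_heatmaps(heatmaps):
    """Pick the peak pixel of each joint heat map (stand-in for stage-1 output).

    heatmaps: (J, H, W) array, one heat map per joint.
    Returns a (J, 2) array of (u, v) pixel coordinates.
    """
    J, H, W = heatmaps.shape
    flat_idx = heatmaps.reshape(J, -1).argmax(axis=1)
    v, u = np.unravel_index(flat_idx, (H, W))
    return np.stack([u, v], axis=1)

def back_project(joints_uv, depth, fx, fy, cx, cy):
    """Lift 2D joint pixels into 3D camera coordinates using the depth map.

    joints_uv: (J, 2) pixel coordinates; depth: (H, W) depth image in metres.
    Uses the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    """
    points = []
    for u, v in joints_uv:
        z = depth[v, u]
        points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return np.array(points)
```

In the patent's pipeline the second-stage network refines these coordinates before the lift to 3D; the naive per-pixel depth lookup here is only the simplest possible dimension upgrade.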

[0056] In order to realize the estimation of the human body posture in the color image, the present invention adopts the ...


PUM

No PUM

Abstract

The invention discloses a human body posture visual identification method for a transfer-and-carrying nursing robot. The method uses a two-stage network based on RGBD input (color image RGB + depth image Depth), in which the first-stage neural network is PAF and the second-stage neural network is an improved ResNet, to achieve higher-precision recognition at close range. The ResNet network, commonly used for classification, is improved by modifying its input and output structures so that it can solve the human joint recognition problem; serving as the second-stage network, it obtains higher precision than existing methods. A nearest-neighbor contour-following method automatically tracks the armpit contour of the human body while depending on only two joint coordinates; the contour is then repaired by a convex hull algorithm, and the armpit center, namely the armpit point, is finally obtained. This armpit-point recognition method reduces the dependence of the armpit point on joint positions while improving accuracy.
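The contour-repair step described in the abstract (convex hull over the tracked armpit contour, then taking its center as the armpit point) can be sketched as follows. The hull routine here is Andrew's monotone chain, used as a stand-in for whichever convex hull implementation the patent employs, and taking the hull centroid as the armpit point is an illustrative simplification.

```python
def _cross(o, a, b):
    """Cross product of vectors o->a and o->b (positive = counter-clockwise turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Convex hull of 2D points via Andrew's monotone chain, in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def build(seq):
        chain = []
        for p in seq:
            # Drop points that would make a clockwise (or collinear) turn.
            while len(chain) >= 2 and _cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain
    lower = build(pts)
    upper = build(reversed(pts))
    return lower[:-1] + upper[:-1]

def armpit_center(contour):
    """Repair a tracked armpit contour with its convex hull and return the
    hull centroid as the armpit point (a sketch of the described step)."""
    hull = convex_hull(contour)
    xs = [p[0] for p in hull]
    ys = [p[1] for p in hull]
    return (sum(xs) / len(hull), sum(ys) / len(hull))
```

A noisy concave dent in the tracked contour (e.g. from a depth shadow) vanishes after the hull repair, which is presumably why the patent applies it before locating the armpit center.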

Description

[0001] Technical field:

[0002] The invention belongs to the technical field of nursing robots, and in particular relates to visual recognition of human body postures for transfer-and-carrying nursing robots.

[0003] Background technique:

[0004] China has entered an aging society. At present there are more than 230 million people over the age of 60, and the degree of aging is becoming more and more serious. With the continuous progress of China's economy and technology, the demand for intelligent transfer, transport, and nursing robots in the Chinese market keeps growing. Intelligent perception and understanding of the environment is the key to making such robots intelligent, and human body posture recognition is in turn a key part of environmental perception.

[0005] At present, human body posture recognition is mostly contact-based: the posture is estimated from markers attached to the body. ...

Claims


Application Information

Patent Timeline
no application
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/10, G06V40/28, G06N3/045, G06F18/24147
Inventor: 陈梦倩, 李顺达, 郭士杰, 刘今越, 贾晓辉, 刘彦开
Owner HEBEI UNIV OF TECH