Data-driven real-time hand action evaluation method based on RGB video

A data-driven hand movement evaluation technology, applied to computer components, instruments, biological neural network models, etc., achieving the effects of improved accuracy and robustness and improved computational efficiency.

Pending Publication Date: 2020-07-28
SHANGHAI JIAO TONG UNIV +1

AI Technical Summary

Problems solved by technology

Aiming at the research gap in hand motion evaluation within the field of video-based human behavior analysis, the problem of correspondence errors between extracted features and real physical body parts in human motion evaluation, and the real-time requirements of motion evaluation systems, a real-time hand motion recognition and evaluation method based on RGB video is proposed to improve the matching accuracy between the extracted features and the real human hand.




Embodiment Construction

[0062] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0063] Image depth (also called pixel depth) refers to the number of bits used to store each pixel, and it measures the color resolution of an image. It determines the number of colors each pixel of a color image may take, or the number of gray levels each pixel of a grayscale image may take, and therefore the maximum number of colors that can appear in a color image or the maximum gray level in a grayscale image. While the pixel depth or image depth can be very large, the color depth of display devices is limited. For example, standard VGA supports 4-bit, 16-color images, and at least 8-bit, 256-color images are recommended for multimedia applications. Due to equipment limitations and the limits of human visual resolution, in general it is not necessary to pursue a ...
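The relationship between bit depth and the number of representable colors described above is simply a power of two; the examples in the text (4-bit VGA and 8-bit multimedia images) can be checked with a one-line helper:

```python
def color_count(bit_depth: int) -> int:
    """Number of distinct values a pixel can take at a given bit depth (2^bits)."""
    return 2 ** bit_depth

# Examples from the text: 4-bit standard VGA and 8-bit multimedia images.
print(color_count(4))  # 16 colors
print(color_count(8))  # 256 colors
```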



Abstract

The invention discloses a data-driven real-time hand action evaluation method based on RGB video, belonging to the field of human behavior analysis based on video processing. The method comprises a hand posture estimation unit and an action evaluation unit. The hand posture estimation unit extracts hand key-point coordinates from each frame of the video; the action evaluation unit predicts scores for hand action quality and gives suggestions on how to improve them. Posture estimation and action quality evaluation are carried out through deep-learning-based methods. The method improves the matching accuracy between the extracted features and the details of the human hand in real scenes as the camera viewing angle continuously changes. It improves the computational efficiency of overall action recognition and evaluation, achieves real-time virtual reconstruction of hand actions, evaluates human hand actions accurately in real time, and improves the accuracy and robustness of overall action evaluation. The invention can be widely applied in fields such as vision-based hand posture estimation and action quality evaluation.
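The abstract describes a two-stage pipeline: a pose estimation unit that extracts per-frame hand keypoints, followed by an evaluation unit that scores action quality. A minimal structural sketch of the second stage is given below; the data layout and the toy smoothness-based scoring rule are illustrative assumptions, not the patent's actual learned evaluator:

```python
from typing import List, Tuple

# One frame's output from the (assumed) pose estimation unit:
# a list of 2D hand-keypoint coordinates.
Keypoints = List[Tuple[float, float]]

def evaluate_action(seq: List[Keypoints]) -> float:
    """Toy action-quality score over a keypoint sequence: mean per-keypoint
    displacement between consecutive frames, mapped so that smoother motion
    scores closer to 1.0. A stand-in for the patent's learned evaluator."""
    if len(seq) < 2:
        return 1.0
    total, count = 0.0, 0
    for prev, cur in zip(seq, seq[1:]):
        for (x0, y0), (x1, y1) in zip(prev, cur):
            total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            count += 1
    return 1.0 / (1.0 + total / count)

# A perfectly still hand (zero displacement) gets the maximum score.
still = [[(0.0, 0.0), (1.0, 1.0)]] * 3
print(evaluate_action(still))  # 1.0
```

A real system would feed the keypoint sequence to a trained network rather than a hand-written metric, but the interface (frames of keypoints in, a quality score out) matches the units named in the abstract.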

Description

technical field

[0001] The invention belongs to the field of human behavior analysis based on video processing, and in particular relates to a real-time evaluation method for hand movements based on RGB video.

Background technique

[0002] The rapid development of computer vision in recent years has produced many reliable methods for object detection and action recognition from images and videos. Building on this, the academic community has gradually begun to explore video-based assessment of human motion quality.

[0003] At present, considerable progress has been made on macroscopic movements of the human body.

[0004] In the paper "Assessing the Quality of Actions" (Hamed Pirsiavash, Carl Vondrick, and Antonio Torralba. 2014. In European Conference on Computer Vision (ECCV). Springer International, 556-571) ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/34, G06K9/62, G06N3/04
CPC: G06V40/28, G06V20/40, G06V10/267, G06N3/044, G06N3/045, G06F18/2411, Y02A90/10
Inventor: 李冕, 王天予, 王毅杰
Owner SHANGHAI JIAO TONG UNIV