
Human body action recognition method and system based on deep learning

A human action recognition and deep learning technology, applied in the field of deep learning, which solves the problems of a shallow extraction network, low recognition accuracy, and failure to consider the temporal relationship between features, achieving good classification results and high recognition accuracy.

Active Publication Date: 2019-08-16
CHANGSHA UNIVERSITY

AI Technical Summary

Problems solved by technology

[0006] In view of the above defects or improvement needs of the prior art, the present invention provides a human action recognition method and system based on deep learning, which aims to solve the technical problems that, because existing methods use a shallow extraction network and do not consider the temporal relationship between features, the extracted features are not conducive to classification and the recognition accuracy is low.



Examples


Embodiment Construction

[0047] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments described below can be combined with each other as long as they do not conflict.

[0048] As shown in figure 1, the human action recognition method based on deep learning of the present invention includes: acquiring two consecutive frames of images from a video sequence and inputting the two consecutive frames into the trained human action recognition model to obtain the human action recognition result, where the human action recognition model is generated t...
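To make the recognition step in [0048] concrete, below is a minimal inference sketch. The checkpoint path, the channel-stacking preprocessing, and the loading mechanism are all assumptions for illustration; the visible text states only that two consecutive frames are fed to a trained recognition model.

```python
import cv2
import numpy as np
import torch

# Hypothetical checkpoint; the patent does not specify how the
# trained model is stored or what input format it expects.
model = torch.load("action_recognition_model.pt", weights_only=False)
model.eval()

cap = cv2.VideoCapture("input_video.mp4")
ok1, frame1 = cap.read()
ok2, frame2 = cap.read()   # two consecutive frames from the video sequence
cap.release()
assert ok1 and ok2, "could not read two consecutive frames"

# Stack the frame pair along the channel axis: (H, W, 6) -> (1, 6, H, W).
pair = np.concatenate([frame1, frame2], axis=2).astype(np.float32) / 255.0
x = torch.from_numpy(pair).permute(2, 0, 1).unsqueeze(0)

with torch.no_grad():
    logits = model(x)                    # class scores per action
    action = int(logits.argmax(dim=1))   # predicted action label index
print(action)
```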



Abstract

The invention discloses a human body action recognition method based on deep learning. The method comprises the steps of obtaining two continuous frames of images in a video sequence and inputting the two continuous frames into a trained human body action recognition model to obtain a human body action recognition result, wherein the human body action recognition model is generated through the following steps: obtaining two continuous frames of images from a video sequence in the data set and extracting an optical flow image from the two acquired frames by utilizing an optical flow extraction method; repeating this process for all the remaining frames in the video sequence; dividing the video sequence and the optical flow image sequence evenly into T segments; and extracting a single frame image from each segment of the video sequence and L continuous frames of optical flow images from each segment of the optical flow image sequence. The method solves the technical problems of existing human body action recognition methods that, because the extraction network is shallow and the temporal relationship between features is not considered, the extracted features are not conducive to classification and the recognition accuracy is low.
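As a reading aid, here is a minimal sketch of the preprocessing the abstract describes: per-pair optical flow extraction followed by T-segment sampling. The abstract does not name the optical flow extraction method, so OpenCV's dense Farneback flow is used purely as a stand-in, and the function names, T, and L are illustrative.

```python
import cv2
import numpy as np

def extract_flow_sequence(frames):
    """Compute one optical flow image per pair of consecutive frames.
    Farneback flow is an assumed stand-in for the unnamed method."""
    flows = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(
            prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        flows.append(flow)        # (H, W, 2): x/y displacement field
        prev = curr
    return flows

def sample_segments(frames, flows, T=3, L=5):
    """Divide both sequences evenly into T segments; take one RGB frame
    per video segment and L consecutive flow images per flow segment.
    Assumes each flow segment holds at least L images."""
    rgb_samples, flow_stacks = [], []
    f_seg, o_seg = len(frames) // T, len(flows) // T
    for t in range(T):
        # one representative frame from the middle of the video segment
        rgb_samples.append(frames[t * f_seg + f_seg // 2])
        # L consecutive optical flow images from the flow segment
        flow_stacks.append(np.stack(flows[t * o_seg:t * o_seg + L]))
    return rgb_samples, flow_stacks
```

T and L are free parameters of the sampling scheme; the visible text leaves their values to the training configuration.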

Description

technical field

[0001] The invention belongs to the technical field of deep learning and, more specifically, relates to a human action recognition method and system based on deep learning.

Background technique

[0002] Traditional human motion recognition attaches acquisition devices, such as biosensors or mechanical sensors, to the human body. It is a contact-based motion detection method, which can cause discomfort or fatigue. With the development of technology, this recognition mode has gradually been replaced by image-based recognition methods.

[0003] The introduction of deep learning has brought breakthroughs to machine learning and opened a new development direction for human action recognition. Unlike traditional recognition methods, deep learning can automatically learn high-level features from low-level features, which solves the problem that feature selection depends too heavily on the task itself and that the tuning process takes a long time. ...
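The visible excerpt does not disclose the network itself. As an illustration of how the two stated problems (a shallow extraction network and missing temporal modeling) can be addressed together, the sketch below pairs a deep two-stream backbone with a simple average consensus over the T segments. This is an assumed TSN-style design for illustration only, not the patented architecture.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class TwoStreamSegmentNet(nn.Module):
    """Illustrative two-stream network: a deep RGB backbone plus a flow
    backbone, with class scores averaged over the T temporal segments.
    An assumed design, not the architecture claimed in the patent."""

    def __init__(self, num_classes, flow_len=5):
        super().__init__()
        self.rgb_net = resnet50(num_classes=num_classes)
        self.flow_net = resnet50(num_classes=num_classes)
        # A flow stack has 2 * flow_len channels (x/y field per frame).
        self.flow_net.conv1 = nn.Conv2d(
            2 * flow_len, 64, kernel_size=7, stride=2, padding=3, bias=False)

    def forward(self, rgb, flow):
        # rgb:  (B, T, 3, H, W)     one RGB frame per segment
        # flow: (B, T, 2*L, H, W)   L consecutive flow images per segment
        B, T = rgb.shape[:2]
        rgb_scores = self.rgb_net(rgb.flatten(0, 1)).view(B, T, -1)
        flow_scores = self.flow_net(flow.flatten(0, 1)).view(B, T, -1)
        # Temporal consensus: average segment scores, then fuse the streams.
        return rgb_scores.mean(dim=1) + flow_scores.mean(dim=1)
```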


Application Information

IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/23, G06V20/42, G06N3/045
Inventors: 李方敏, 刘新华, 彭小兵, 旷海兰, 黄志坚, 杨志邦, 阳超
Owner: CHANGSHA UNIVERSITY