
Human action recognition method based on depth motion map (DMM) generated from motion history point cloud (MHPC)

A point-cloud generation and motion-history technology, applied in the fields of computer vision and image processing, that solves problems such as high computational complexity and cumbersome algorithms, and achieves the effects of reducing computational complexity, increasing robustness, and increasing numbers

Inactive Publication Date: 2018-09-25
CIVIL AVIATION UNIV OF CHINA


Problems solved by technology

Although this method extracts point-cloud features, the algorithm is overly cumbersome and its computational complexity is high.




Embodiment Construction

[0026] The human action recognition method based on the depth motion map generated from the motion history point cloud, provided by the present invention, is described in detail below with reference to the accompanying drawings and specific embodiments.

[0027] As shown in Figure 1, the human action recognition method based on the depth motion map generated from the motion history point cloud, provided by the present invention, comprises the following steps, performed in order:

[0028] (1) Obtain the point cloud of each frame of the depth image by coordinate-mapping the multi-frame depth images, from which the foreground has already been extracted, in each human action sample; fill the resulting points into the MHPC until the depth images of all frames have been traversed, yielding the MHPC of the action, which records the spatial and temporal information of the action;
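Step (1) above can be sketched as follows. This is a minimal illustration, not the patent's exact procedure: the pinhole camera intrinsics (fx, fy, cx, cy) and the use of a per-point frame-index channel to carry the temporal information are assumptions, since the text does not specify the camera model or the MHPC storage layout.

```python
import numpy as np

def depth_to_point_cloud(depth, fx=580.0, fy=580.0, cx=160.0, cy=120.0):
    """Pinhole back-projection of foreground pixels: (u, v, depth) -> (x, y, z).
    Pixels with depth == 0 are treated as background and skipped."""
    v, u = np.nonzero(depth)
    z = depth[v, u].astype(np.float64)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def build_mhpc(depth_frames):
    """Accumulate per-frame clouds into one array, tagging each point with its
    frame index so the MHPC records both spatial and temporal information."""
    clouds = []
    for t, depth in enumerate(depth_frames):
        pts = depth_to_point_cloud(depth)
        ts = np.full((pts.shape[0], 1), t, dtype=np.float64)
        clouds.append(np.hstack([pts, ts]))   # columns: x, y, z, t
    return np.vstack(clouds)

# Tiny synthetic example: two 4x4 depth frames with a moving foreground pixel.
f0 = np.zeros((4, 4)); f0[1, 1] = 1000.0
f1 = np.zeros((4, 4)); f1[1, 2] = 1000.0
mhpc = build_mhpc([f0, f1])
print(mhpc.shape)   # (2, 4): two points, each with (x, y, z, t)
```

In a real setting the intrinsics would come from the sensor (e.g. a Kinect-style depth camera, as used for the MSR Action3D data), and the frames would be the foreground-extracted depth images of one action sample.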

[0029] The specific method is as follows:

[0030] The human action samples are selected from the MSR Action3D database. The depth ima...



Abstract

The invention discloses a human action recognition method based on a depth motion map (DMM) generated from a motion history point cloud (MHPC). The method comprises the steps of: generating the MHPC; generating the DMM; extracting HOG feature vectors; and training and testing an SVM classifier, whose output is the human action classification result. The method can acquire information about a human action at different view angles, enhancing robustness to changes in action angle. When the motion history point cloud is projected to generate the depth motion map, a coordinate normalization operation is carried out, enhancing robustness to intra-class differences between actions; and the HOG features extracted from the depth motion map generated by projecting the motion history point cloud can effectively characterize human action classes, avoiding the complexity of extracting features directly from the point cloud.
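The projection-with-normalization step in the abstract can be sketched as follows. This is an illustrative approximation, not the patent's exact construction: the 32x32 grid size, the choice of point counts as pixel intensity, and the specific axis pairings for the front/side/top views are all assumptions.

```python
import numpy as np

def project_to_dmm(points, axes=(0, 1), grid=(32, 32)):
    """Project an MHPC (N x 4 array of x, y, z, t points) onto one plane.
    The two chosen coordinates are first normalized into [0, 1), then
    histogrammed onto a fixed grid; the normalization is what gives
    robustness to intra-class scale differences between performers."""
    a, b = points[:, axes[0]], points[:, axes[1]]
    na = (a - a.min()) / (a.max() - a.min() + 1e-9)
    nb = (b - b.min()) / (b.max() - b.min() + 1e-9)
    dmm, _, _ = np.histogram2d(na, nb, bins=grid, range=[[0, 1], [0, 1]])
    return dmm

# Stand-in cloud with columns (x, y, z, t); a real MHPC would come from the
# depth frames of one action sample.
rng = np.random.default_rng(0)
mhpc = rng.random((500, 4))
dmm_front = project_to_dmm(mhpc, (0, 1))    # xy plane (front view)
dmm_side  = project_to_dmm(mhpc, (2, 1))    # zy plane (side view)
dmm_top   = project_to_dmm(mhpc, (0, 2))    # xz plane (top view)
print(dmm_front.shape, int(dmm_front.sum()))   # (32, 32), all 500 points binned
```

HOG descriptors would then be computed on each DMM, concatenated, and fed to the SVM classifier described in the abstract.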

Description

technical field [0001] The invention belongs to the technical fields of computer vision and image processing, and in particular relates to a human action recognition method based on a depth motion map (DMM) generated from a motion history point cloud (MHPC). Background technique [0002] The research significance of human action recognition lies mainly in its practical value: it has a wide range of applications in intelligent video surveillance, video content retrieval, human motion analysis, assisted medical care, and other fields, and experts and scholars at home and abroad have conducted extensive research on it. Most early action recognition methods were based on traditional RGB information, producing methods built on key postures, silhouettes, and spatiotemporal features of the human body. However, because RGB information is easily affected by factors such as illumination, camera angle, and background changes, action recognition stil...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K 9/00; G06K 9/62
CPC: G06V 40/20; G06V 40/28; G06F 18/2411
Inventors: 张良 (Zhang Liang), 刘婷婷 (Liu Tingting)
Owner: CIVIL AVIATION UNIV OF CHINA