
Fuzzy boundary fragmentation-based depth motion map human body action recognition method

A technology of fuzzy boundary segmentation for action recognition, applied to character and pattern recognition, instruments, computer parts, etc. It solves the problem of lost temporal action information and achieves improved classification accuracy, efficient segmentation, and better robustness.

Active Publication Date: 2017-03-22
慧镕电子系统工程股份有限公司

AI Technical Summary

Problems solved by technology

However, this method accumulates all video frames into a single DMM and therefore loses the temporal information of the action.



Examples


Embodiment Construction

[0036] In order to better illustrate the purpose, specific steps and characteristics of the present invention, the present invention will be further described in detail below in conjunction with the accompanying drawings, taking the MSRAction3D data set as an example:

[0037] In the fuzzy-boundary-segmentation depth motion map human behavior recognition method proposed by the present invention, the flow of feature extraction is shown in figure 1. First, the sample is divided into equal segments to determine the boundaries, and the degree of blurring of each boundary is then determined according to the parameter α. For each segmented video sub-sequence, its depth motion map (DMM) is calculated; the DMMs of all samples are fixed to the same size and normalized, and the features of the sub-sequences are obtained after serial vectorization, which completes the construction of the output features of the training samples.
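
As an illustration of the segmentation step, here is a minimal Python sketch that splits an N-frame depth sequence into equal segments and then widens each boundary by a fraction α of the segment length. The function name and the exact overlap rule are assumptions made for illustration; the patent only states that the blurring degree of the boundary is determined by the parameter α.

def fuzzy_segments(num_frames, num_segments, alpha):
    """Split a depth-map sequence into equal segments, then blur each
    boundary by extending the segment alpha * segment_length frames on
    both sides (an assumed reading of the fuzzy parameter alpha)."""
    seg_len = num_frames / num_segments
    segments = []
    for k in range(num_segments):
        start, end = k * seg_len, (k + 1) * seg_len
        blur = alpha * seg_len                       # frames shared across a fuzzy boundary
        lo = max(0, int(round(start - blur)))
        hi = min(num_frames, int(round(end + blur)))
        segments.append((lo, hi))                    # frame index range [lo, hi) of the sub-sequence
    return segments

# Example: a 60-frame clip, 4 segments, alpha = 0.2 gives overlapping sub-sequences
print(fuzzy_segments(60, 4, 0.2))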

[0038] A human behavior reco...



Abstract

The present invention discloses a fuzzy boundary fragmentation-based depth motion map human body action recognition method. The model training method includes the following steps: a video depth map sequence is fragmented, and the fuzzy boundaries of the fragments are determined according to a fuzzy parameter alpha; the depth motion maps (DMM) of the front view, left view and top view of each sub-sequence obtained after the fragmentation are calculated; the depth motion maps are converted to a fixed size through interpolation and normalized; the normalized depth motion maps (DMM) of each sub-sequence of the video sequence are cascaded to obtain the feature vector of the video sequence; and a robust probabilistic collaborative representation based classifier (R-ProCRC) is adopted to classify the features, realizing human body action recognition. With the human body action recognition method of the invention, the variation of time-domain features is effectively captured, the robustness of the action features to time-domain differences is enhanced, and robust recognition of human body actions can be realized.
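
As a rough sketch of the feature-construction steps described above (not the patented implementation itself), the Python fragment below accumulates thresholded inter-frame differences into a depth motion map for a single projection, resizes it by interpolation, normalizes it, and cascades the per-sub-sequence features into one vector. The threshold, the fixed output size, and the restriction to one view are assumptions; the front, left and top views would each be handled this way, and the R-ProCRC classification step is not shown.

import numpy as np
from scipy.ndimage import zoom   # interpolation-based resizing

def dmm(frames, eps=2.0):
    """Depth motion map of one sub-sequence: accumulate absolute
    inter-frame differences, keeping only motion above a small
    threshold eps (the threshold value is an assumption)."""
    frames = np.asarray(frames, dtype=np.float32)    # shape (T, H, W)
    diffs = np.abs(np.diff(frames, axis=0))
    diffs[diffs < eps] = 0.0
    return diffs.sum(axis=0)

def subsequence_feature(frames, out_size=(100, 50)):
    """One sub-sequence's DMM, resized to a fixed size by interpolation,
    normalized to [0, 1], and flattened into a vector."""
    m = dmm(frames)
    m = zoom(m, (out_size[0] / m.shape[0], out_size[1] / m.shape[1]), order=1)
    rng = m.max() - m.min()
    m = (m - m.min()) / rng if rng > 0 else np.zeros_like(m)
    return m.ravel()

def video_feature(subsequences):
    """Cascade the per-sub-sequence DMM features into the final feature vector."""
    return np.concatenate([subsequence_feature(s) for s in subsequences])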

Description

[0001] Technical field:

[0002] The invention belongs to the field of machine vision, and in particular relates to a human behavior recognition method based on fuzzy boundary segmentation of depth motion maps.

[0003] Background art:

[0004] Human behavior recognition technology extracts behavioral features from video sequences of human behavior and recognizes actions through these features.

[0005] In the field of machine vision and pattern recognition, human behavior recognition has become a very active branch. The technology has many potential applications in human-computer interaction, including video analysis, surveillance systems, intelligent robots, etc. In the past few years, research on human behavior recognition was mainly based on image frame sequences collected by color cameras [1][2]. This kind of data has inherent defects: it is very sensitive to illumination, occlusion and complex backgrounds, which affect reco...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V40/20; G06F18/214; G06F18/24
Inventor: 蒋敏, 金科, 孔军, 昝宝锋, 胡珂杰, 徐海洋, 刘天山
Owner: 慧镕电子系统工程股份有限公司