
Human action feature extraction method based on global salient edge region

A human action feature extraction technology based on salient edge regions, applied in the field of video analysis, which addresses problems such as the large amount of time spent labeling video samples

Inactive Publication Date: 2019-01-18
WUHAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this type of method is that it either takes a great deal of time to label video samples by hand using human experience, or relies on smart somatosensory devices to calibrate skeletal joint points




Embodiment Construction

[0059] To help those of ordinary skill in the art understand and implement the present invention, it is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the embodiments described here are only for illustration and explanation of the present invention and are not intended to limit it.

[0060] Referring to figure 1, a method for extracting human motion features based on a global salient edge region, provided by an embodiment of the present invention, specifically includes the following steps:

[0061] Step 1: Reduce the number of colors in the RGB color space and smooth the saliency of the color space. The specific implementation process is: define the saliency S(·) of the kth pixel I_k in image I as:

[0062] S(I_k) = Σ_{I_i ∈ I} D(I_k, I_i)

[0063] where D(I_k, I_i) is a distance metric between pixel I_k and pixel I_i in color space. In this application, all S(...
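The color-reduction and global-contrast computation of Step 1 can be sketched in Python. This is a minimal illustration, assuming a Euclidean distance for D(I_k, I_i) and a per-channel quantization into `bins` levels; the exact distance metric and the number of retained colors used by the patent are not shown in this excerpt:

```python
import numpy as np

def global_contrast_saliency(image, bins=12):
    """Per-pixel global-contrast saliency, a sketch of
    S(I_k) = sum_i D(I_k, I_i) with colors quantized to reduce cost.

    `bins` (levels per RGB channel) is an illustrative assumption."""
    # Quantize each RGB channel to `bins` levels, reducing distinct colors.
    q = image.astype(np.int64) * bins // 256
    # Flatten quantized colors to single indices and count occurrences.
    idx = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    colors, inverse, counts = np.unique(
        idx.ravel(), return_inverse=True, return_counts=True)
    # Reconstruct a representative (quantized) RGB triple per distinct color.
    reps = np.stack([colors // (bins * bins),
                     (colors // bins) % bins,
                     colors % bins], axis=1).astype(np.float64)
    # Pairwise Euclidean distance D between distinct colors.
    diff = reps[:, None, :] - reps[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # Saliency of each distinct color: count-weighted sum of distances
    # to every other pixel's color (equivalent to summing over pixels).
    sal = (dist * counts[None, :]).sum(axis=1)
    # Normalize to [0, 1]; a uniform image maps to all zeros.
    sal = (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)
    return sal[inverse].reshape(image.shape[:2])
```

Quantizing first means the pairwise distance sum runs over distinct colors rather than all pixels, which is the practical point of reducing the number of colors before computing saliency.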



Abstract

The invention discloses a human body action feature extraction method based on a global salient edge region. The method computes saliency from the contrast between a region and the entire image; reduces the number of colors in the color space and smooths the saliency of the color space; computes the salient region according to the spatial relationships of adjacent regions; applies a morphological gradient to the foreground region segmented by a binarization threshold to generate a global salient edge region; traverses the strong corners of all grids at different scales in the video frame; collects key feature points whose optical flow magnitude is non-zero within the salient edge region; computes the displacement of strong corner points from the corrected optical flow field; and uses the continuous multi-frame coordinate displacement trajectories of strong corner points together with neighborhood gradient vectors to form the local spatiotemporal features of human motion. By extracting action features through the global salient edge region, the present invention eliminates background noise points irrelevant to human movement, removes the influence of camera movement on optical flow calculation, improves the accuracy of local spatiotemporal feature description of human action, and improves the recognition rate of human action.
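The binarization and morphological-gradient steps in the abstract can be illustrated with a small NumPy sketch. The threshold value and the square structuring element size here are illustrative assumptions, not parameters taken from the patent:

```python
import numpy as np

def morph_gradient(binary, k=3):
    """Morphological gradient (dilation minus erosion) with a k x k
    square structuring element, via sliding-window max/min."""
    pad = k // 2
    p = np.pad(binary, pad, mode='edge')
    # All k x k windows over the padded map, shape (H, W, k, k).
    windows = np.lib.stride_tricks.sliding_window_view(p, (k, k))
    dilated = windows.max(axis=(2, 3))
    eroded = windows.min(axis=(2, 3))
    return dilated - eroded

def salient_edge_region(saliency, thresh=0.5, k=3):
    """Binarize the saliency map, then take the morphological gradient,
    yielding an edge band like the 'global salient edge region' in the
    abstract. `thresh` and `k` are illustrative, not from the patent."""
    fg = (saliency >= thresh).astype(np.uint8)
    return morph_gradient(fg, k)
```

The gradient (dilation minus erosion) leaves a thin band along the foreground boundary; in the described pipeline, key feature points with non-zero optical flow magnitude are then collected only inside this band, discarding background corners.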

Description

Technical Field

[0001] The invention belongs to the field of video analysis and relates to a method for automatic recognition of human behavior, in particular to a method for extracting human motion features based on a global salient edge region.

Background Technique

[0002] With the continuous development of the Internet and the continuous promotion of video surveillance systems, the amount of video data has increased dramatically. Faced with this massive video data, how to analyze human behavior in video has become an urgent problem. Because video data is easily affected by unclear foreground motion areas, large camera shake, and complex scene environments, human motion in video contains a large number of noise corners, resulting in inaccurate extraction of key feature points in video frames and limited precision of human behavior recognition.

[0003] Human action feature extraction is an important part of human action recognition and belongs to an im...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/46; G06T7/11; G06T7/155; G06T7/136; G06T7/194
CPC: G06T2207/20036; G06V40/20; G06V10/44
Inventor: 胡瑞敏, 徐增敏, 陈军, 陈华锋, 李红阳, 王中元, 郑淇, 吴华, 王晓, 周立国
Owner: WUHAN UNIV