
Human Action Recognition Method Based on Structured Feature Map

A recognition method and feature-map technology applied to human action recognition, addressing problems such as the neglect of the spatial-temporal correlations of local features and the limited treatment of multi-person interactive behaviors.

Publication Date: 2018-08-31 (Inactive)
BEIJING JIAOTONG UNIV +1

AI Technical Summary

Problems solved by technology

Although the approach of local features combined with a bag-of-words model is simple and effective and does not require tracking the human body, it ignores the spatial-temporal correlations among the local features.
In addition, current research on human action recognition focuses mainly on simple single-person actions, while the analysis of the more practically relevant multi-person interactive behaviors is rarely addressed.
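
To make the first limitation concrete, here is a small illustrative Python sketch that is not taken from the patent: a bag-of-words histogram over space-time interest points records only which visual words occur, so two clips whose local features sit in very different spatial-temporal layouts can receive identical descriptors. The clip data and visual-word ids below are invented for illustration.

```python
import numpy as np

# Hypothetical illustration (not part of the patent): a bag-of-words encoding
# counts visual-word occurrences and discards where/when each local feature
# occurred, so different spatio-temporal layouts can give identical histograms.

vocabulary_size = 4

# Each detected space-time interest point: (x, y, t, visual_word_id)
clip_a = [(10, 12, 0, 2), (40, 60, 5, 1), (80, 20, 9, 2), (15, 70, 3, 0)]
clip_b = [(80, 20, 9, 2), (15, 70, 3, 0), (10, 12, 0, 1), (40, 60, 5, 2)]  # same words, different layout

def bag_of_words(points, k):
    """Histogram of visual-word ids; positions and times are ignored."""
    hist = np.zeros(k)
    for _x, _y, _t, word in points:
        hist[word] += 1
    return hist / hist.sum()

print(bag_of_words(clip_a, vocabulary_size))  # identical histograms ...
print(bag_of_words(clip_b, vocabulary_size))  # ... despite different spatio-temporal structure
```

The structured feature maps and temporal graph kernel described in the abstract below are intended precisely to retain the layout information that such a histogram discards.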




Detailed Description of the Embodiments

[0026] Those skilled in the art will understand that, unless otherwise stated, the singular forms "a", "an", "said" and "the" used herein may also include the plural forms. It should be further understood that the word "comprising" used in this description refers to the presence of the stated features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Additionally, "connected" or "coupled" as used herein may include wirelessly connected or coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

[0027] Those skilled in the ar...



Abstract

The invention discloses a human action recognition method based on structured feature maps. The method comprises the following steps: spatio-temporal interest points are extracted from a sample and feature vectors are generated; with the feature vectors as initial values, the parameters of a hidden conditional random field (HCRF) model are trained using a preset function, and elements carrying intermediate semantic information, together with the spatial relations between them, are learned; the sample is divided into temporal units, feature maps are built inside each temporal unit, and the feature maps are mapped into a feature space; the temporal units are connected through their temporal relations to build a chain graph, and a temporal graph kernel is proposed; the temporal graph kernel is then used to classify and recognize the behavior video to be recognized. By using an HCRF to learn the elements and their association relations, using feature maps to describe the video sequence, and using the temporal graph kernel to match feature maps in space and time, behavior sequences of different lengths and speeds can be matched, and the method can be applied both to individual behaviors and to multi-person interactive behaviors.
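
As a rough, non-authoritative sketch of the pipeline summarized above, the following Python code is not the patented implementation: the HCRF learning of mid-level elements and their spatial relations is stood in for by k-means clustering, each temporal unit's feature map is reduced to a histogram of element labels, and the temporal graph kernel is approximated by a dynamic-programming alignment along the chain of temporal units. All names (extract_interest_points, unit_feature_maps, temporal_graph_kernel) and the toy data are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def extract_interest_points(video, n_points=200, dim=162):
    """Placeholder for spatio-temporal interest point detection; returns random
    descriptors and timestamps in place of real HOG/HOF-style features."""
    rng = np.random.default_rng(abs(hash(video)) % (2**32))
    return rng.normal(size=(n_points, dim)), rng.uniform(size=n_points)

def unit_feature_maps(descriptors, times, elements, n_units=8):
    """Split a clip into temporal units and describe each unit by a normalized
    histogram of element labels (a stand-in for the per-unit feature map)."""
    labels = elements.predict(descriptors)
    maps = np.zeros((n_units, elements.n_clusters))
    for lab, t in zip(labels, times):
        u = min(int(t * n_units), n_units - 1)
        maps[u, lab] += 1
    norms = np.linalg.norm(maps, axis=1, keepdims=True)
    return maps / np.maximum(norms, 1e-8)

def temporal_graph_kernel(maps_a, maps_b):
    """Dynamic-programming alignment of the two chains of temporal units, so
    sequences of different length and speed can still be compared."""
    na, nb = len(maps_a), len(maps_b)
    dp = np.zeros((na + 1, nb + 1))
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            sim = float(maps_a[i - 1] @ maps_b[j - 1])
            dp[i, j] = sim + max(dp[i - 1, j - 1], dp[i - 1, j], dp[i, j - 1])
    return dp[na, nb] / max(na, nb)

# ---- toy usage on invented data ----
train_videos = [f"video_{i}" for i in range(12)]
train_labels = np.array([i % 2 for i in range(12)])  # two hypothetical action classes

# Learn mid-level "elements" by clustering all training descriptors
# (the patent uses an HCRF for this step; k-means is only an illustration).
all_desc = np.vstack([extract_interest_points(v)[0] for v in train_videos])
elements = KMeans(n_clusters=20, n_init=10, random_state=0).fit(all_desc)

train_maps = [unit_feature_maps(*extract_interest_points(v), elements) for v in train_videos]
gram = np.array([[temporal_graph_kernel(a, b) for b in train_maps] for a in train_maps])

clf = SVC(kernel="precomputed").fit(gram, train_labels)

test_maps = unit_feature_maps(*extract_interest_points("query_clip"), elements)
test_row = np.array([[temporal_graph_kernel(test_maps, b) for b in train_maps]])
print("predicted action class:", clf.predict(test_row)[0])
```

Because the kernel aligns chains of temporal units rather than comparing fixed-length vectors, clips of different lengths and speeds can still be matched, mirroring the property stated in the abstract. Note that this simple alignment score is not guaranteed to be positive semi-definite, so a faithful implementation would need a properly constructed graph kernel as described in the patent.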

Description

Technical Field

[0001] The invention belongs to the technical field of visual information processing, and in particular relates to a human action recognition method based on structured feature maps.

Background

[0002] We live in an era of information explosion. Information of all kinds of content, form, and medium arrives continuously and fills our lives, and more than 80% of the information people receive is visual. Faced with such a huge amount of information, manual processing alone can no longer meet demand, so there is an urgent need to research and develop computer capabilities that can take over this work. Vision-based human behavior analysis is a very active research area. Human action recognition has broad application value in human-computer interaction, intelligent surveillance, sports performance analysis, content retrieval, and other fields, and has become a popular research field of ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00
Inventors: 苗振江, 许万茹, 张强, 刘汝杰
Owner: BEIJING JIAOTONG UNIV