
Joint learning method of label and interaction relation for human action recognition

A label-and-action technology in the field of human action recognition, addressing problems such as inability to recognize interactive behaviors, non-convex training objectives, and inapplicability to images containing multiple behavior categories

Active Publication Date: 2020-08-18
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] Existing human action recognition methods cannot be applied to images containing multiple behavior categories, cannot recognize interactive behaviors, and treat interaction as a hidden variable, which makes the training problem non-convex. To overcome these shortcomings, the present invention provides a joint learning method of labels and interaction relations for human action recognition. The method is suitable for images containing multiple behavior categories, jointly learns interactions and individual actions in a new training framework without hidden variables, and proposes an effective algorithm for solving the corresponding inference problem.



Embodiment Construction

[0031] The present invention will be further described below.

[0032] A joint learning method of labels and interaction relations for human action recognition, comprising the following steps:

[0033] 1) Construct the energy function

[0034] Let G = (V, E) denote a graph, where the node set V represents the individual actions of all people and the edge set E represents their interaction information; e.g., e_ij ∈ E indicates that there is an interaction between person i and person j, while the absence of edge e_st means that there is no interaction between person s and person t. I denotes an image, a_i denotes the individual action label of person i, and a = [a_i], i = 1, ..., n, is the vector containing the individual action labels of the n people;
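The graph G = (V, E) above can be sketched as a small data structure; the class and method names here are illustrative assumptions, not the patent's code:

```python
# Hypothetical sketch of the graph G = (V, E): nodes are the n people
# (indexed 0..n-1) and edges mark pairwise interactions.
class InteractionGraph:
    def __init__(self, n):
        self.n = n          # number of people (node set V)
        self.edges = set()  # edge set E of interacting pairs

    def add_interaction(self, i, j):
        # e_ij in E: person i and person j interact (edges are undirected)
        self.edges.add((min(i, j), max(i, j)))

    def interacts(self, s, t):
        # absence of e_st means persons s and t do not interact
        return (min(s, t), max(s, t)) in self.edges

g = InteractionGraph(4)
g.add_interaction(0, 1)
print(g.interacts(0, 1))  # True
print(g.interacts(2, 3))  # False
```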

[0035] Given a new input image I, the goal is to predict the individual action labels a and the interaction graph G by solving the following problem (1):

[0036] (equation (1) appears as an image in the source and is not reproduced here)

[0037] where

[0038] (this definition appears as an image in the source and is not reproduced here)

[0039] where 1[a_i = s] is an indicator function whose value is 1 if a_i = s and 0 o...
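The indicator function 1[a_i = s] can be illustrated with a toy unary energy term. The linear form <w_s, phi_i> used here is an assumption for illustration only; the patent's actual energy terms are not reproduced in this extract:

```python
import numpy as np

def indicator(a_i, s):
    # 1[a_i = s]: 1 if person i's label equals s, else 0
    return 1.0 if a_i == s else 0.0

def unary_energy(w, phi, a_i, num_labels):
    # Illustrative unary term: sum over labels s of 1[a_i = s] * <w_s, phi_i>,
    # i.e. the indicator selects the weight vector of the assigned label.
    return sum(indicator(a_i, s) * float(w[s] @ phi) for s in range(num_labels))

w = np.array([[1.0, 0.0],   # weights for label 0
              [0.0, 2.0]])  # weights for label 1
phi = np.array([3.0, 4.0])  # feature vector of person i
print(unary_energy(w, phi, a_i=1, num_labels=2))  # 8.0
```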



Abstract

A joint learning method of labels and interaction relations for human action recognition, comprising the following steps: 1) construct an energy function from CNN features, HOG features, HOF features, inter-person distances, head orientation, and other information, containing unary energy terms, pairwise energy terms, interaction energy terms, and a regularization term; 2) train all model parameters with large-margin structured learning; 3) predict labels and interaction relations, using an alternating search strategy to solve the complex inference problem by iteratively optimizing the labels and the interaction structure in turn. The invention is applicable to images and videos containing multiple people and multiple behavior categories, and can simultaneously identify individual actions and the interactions between people.
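The alternating search strategy of step 3 can be sketched as a simple coordinate-descent loop: fix the interaction structure G and optimize the labels a, then fix a and optimize G, until the energy stops decreasing. All names and the toy energy below are assumptions for illustration, not the patent's model:

```python
def alternating_inference(energy, init_labels, init_graph,
                          best_labels_given, best_graph_given, max_iters=10):
    # Alternately minimize the energy over labels a and interaction graph G.
    a, G = init_labels, init_graph
    prev = energy(a, G)
    for _ in range(max_iters):
        a = best_labels_given(G)   # optimize individual-action labels, G fixed
        G = best_graph_given(a)    # optimize interaction structure, a fixed
        cur = energy(a, G)
        if cur >= prev:            # no improvement: converged
            break
        prev = cur
    return a, G

# Toy demo: labels and "graph" are single integers; energy is minimized
# (value 0) when they agree.
energy = lambda a, G: int(a != G)
a, G = alternating_inference(
    energy, init_labels=0, init_graph=1,
    best_labels_given=lambda G: G,
    best_graph_given=lambda a: a)
print(a, G)  # 1 1
```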

Description

technical field

[0001] The invention belongs to the field of action recognition in computer vision and relates to a human action recognition method. The invention judges interactions between people while recognizing individual actions.

Background technique

[0002] Recognizing human actions in images or videos is a fundamental problem in computer vision and is crucial in many applications such as sports video analysis, surveillance systems, and video retrieval. In recent work, deep learning has significantly improved the performance of action recognition. However, these works are not suited to data containing multi-person interactions. First, they focus on assigning each image a single action label, which is unsuitable for images containing multiple action categories. Second, they ignore that the interrelationships between people provide important contextual information for recognizing complex human activities such as handshakes, fights, and football games. [0...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/62
CPC: G06V40/20, G06F18/2411, G06F18/214
Inventors: 王振华, 金佳丽, 刘盛, 张剑华, 陈胜勇
Owner: ZHEJIANG UNIV OF TECH