
Zero training sample behavior identification method

A technique concerning training samples and behavior recognition, applied in character and pattern recognition, computer components, instruments, etc. It addresses the problem that binary attributes cannot accurately describe behaviors and actions, and achieves the effect of improved robustness.

Active Publication Date: 2013-11-20
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 4 · Cited by: 13

AI Technical Summary

Problems solved by technology

In real life, however, binary attributes cannot accurately describe behaviors.



Examples

Embodiment Construction

[0016] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0017] Figure 1 is a flow chart of the zero-training-sample behavior recognition method proposed by the present invention. As shown in Figure 1, the method includes the following steps:

[0018] Step S1, extracting the feature vector of each action video sample in the video sample library;

[0019] Said step S1 further comprises the following steps:

[0020] Step S11, using the three-dimensional corner (Harris3D) detector to extract a number of spatio-temporal interest points from each action video sample in the video sample library;
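The patent names Harris3D but gives no implementation details in this excerpt. The following is a minimal NumPy/SciPy sketch of a Harris3D-style detector under common assumptions: grayscale video as a (T, H, W) array, a Gaussian-averaged 3x3 second-moment matrix of spatio-temporal gradients, and the response R = det(M) − k·trace(M)³. The function name `harris3d_points` and all parameter values are illustrative, not taken from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def harris3d_points(video, sigma=1.5, k=0.005, n_points=50):
    """Detect spatio-temporal interest points in a video volume.

    video: array of shape (T, H, W) holding grayscale frames.
    Returns an (n_points, 3) array of (t, y, x) coordinates with the
    strongest Harris3D-style responses.
    """
    v = gaussian_filter(video.astype(np.float64), sigma)
    # Spatio-temporal gradients along the t, y, and x axes.
    Lt, Ly, Lx = np.gradient(v)
    # Entries of the symmetric 3x3 second-moment matrix, Gaussian-averaged.
    s = 2 * sigma
    Mxx = gaussian_filter(Lx * Lx, s); Myy = gaussian_filter(Ly * Ly, s)
    Mtt = gaussian_filter(Lt * Lt, s); Mxy = gaussian_filter(Lx * Ly, s)
    Mxt = gaussian_filter(Lx * Lt, s); Myt = gaussian_filter(Ly * Lt, s)
    # det(M) and trace(M) evaluated per voxel.
    det = (Mxx * (Myy * Mtt - Myt ** 2)
           - Mxy * (Mxy * Mtt - Myt * Mxt)
           + Mxt * (Mxy * Myt - Myy * Mxt))
    trace = Mxx + Myy + Mtt
    R = det - k * trace ** 3  # Harris3D corner response
    # Keep the n_points voxels with the largest response.
    idx = np.argsort(R, axis=None)[-n_points:]
    return np.column_stack(np.unravel_index(idx, R.shape))
```

A full detector would also apply non-maximum suppression and search over multiple spatial and temporal scales; this sketch only illustrates the response computation.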

[0021] Step S12, extracting a histogram of oriented gradients (HOG) and a histogram of optical flow (HOF) around each extracted spatio-temporal interest point…
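The descriptor computation in Step S12 can be sketched as follows. This is a deliberately simplified, hypothetical version: it builds a single magnitude-weighted orientation histogram over a square patch, whereas real HOG/HOF descriptors use a grid of cells with block normalization, and HOF bins optical-flow vectors rather than image gradients.

```python
import numpy as np

def hog_descriptor(frame, y, x, patch=16, n_bins=8):
    """HOG-like sketch: magnitude-weighted histogram of gradient
    orientations in a patch x patch window centred on (y, x)."""
    h = patch // 2
    win = frame[max(0, y - h):y + h, max(0, x - h):x + h].astype(np.float64)
    gy, gx = np.gradient(win)
    mag = np.hypot(gx, gy)
    # Unsigned gradient orientation in [0, pi).
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-8)  # L1-normalised descriptor
```

In the patent's pipeline the HOG and HOF histograms around each interest point would be concatenated to form the local feature, and the per-video feature vector of Step S1 would be built from these local features.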



Abstract

The invention discloses a zero-training-sample behavior recognition method comprising the following steps: extract a feature vector for each action video sample; define a set of human-motion attributes and, for each attribute, the relative ranking between pairs of action videos; train a ranking support vector machine with these pairwise relations as input; fit the output ranking scores of each human-behavior class that has training samples with a Gaussian mixture model; obtain Gaussian mixture models for the zero-training-sample behavior classes by transfer learning; extract the feature vector of a test video sample; and decide the zero-training-sample behavior class of the test video by the maximum a posteriori probability principle. Because recognition is performed on ranking scores fitted by Gaussian mixture models and classes are decided by maximum a posteriori probability, the robustness of behavior recognition is improved.
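The fitting and decision stages summarised in the abstract can be sketched with scikit-learn's `GaussianMixture`. This is a hypothetical outline, not the patent's implementation: it assumes each video is already summarised by a vector of attribute ranking scores (the ranking-SVM and transfer-learning stages are omitted), and the function names are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_class_gmms(scores, labels, n_components=2, seed=0):
    """Fit one Gaussian mixture per behaviour class on the ranking-score
    vectors of that class's training videos."""
    return {c: GaussianMixture(n_components, random_state=seed)
                 .fit(scores[labels == c])
            for c in np.unique(labels)}

def classify_map(gmms, score, priors=None):
    """Maximum a posteriori decision: argmax_c log p(score | c) + log P(c).
    With no priors given, this reduces to maximum likelihood."""
    classes = sorted(gmms)
    logp = np.array([gmms[c].score_samples(score[None, :])[0]
                     for c in classes])
    if priors is not None:
        logp = logp + np.log([priors[c] for c in classes])
    return classes[int(np.argmax(logp))]
```

In the patent, the mixtures for classes with no training samples would come from transfer learning rather than from `fit`, but the MAP decision rule is the same.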

Description

Technical field

[0001] The invention belongs to the technical field of intelligent video surveillance, and in particular relates to a zero-training-sample behavior recognition method.

Background technique

[0002] Behavior recognition plays an important role in video surveillance: it recognizes human behavior in video and supports dangerous-behavior alarms and specific-behavior recognition. The simplest and most effective approach to behavior recognition is based on the bag-of-words (BOW) model. This method first extracts features from the video, then clusters all the features, and finally represents each video by a histogram of how often its features fall at each cluster center. A shortcoming of this method, however, is that it does not take the spatio-temporal relationships among features into account. Zhang et al. used a semantics-based linear coding method that not only considers the spatio-temporal relationships between features but also reduces the reconstruction…
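The bag-of-words baseline described above can be sketched in a few lines with scikit-learn's `KMeans`; the function name and parameters are illustrative, and each element of `descriptor_sets` stands for the local descriptors extracted from one video.

```python
import numpy as np
from sklearn.cluster import KMeans

def bow_histograms(descriptor_sets, n_words=100, seed=0):
    """Bag-of-words encoding: cluster all local descriptors into a visual
    vocabulary, then represent each video by the normalised frequency of
    its descriptors over the cluster centres."""
    vocab = KMeans(n_clusters=n_words, random_state=seed, n_init=10)
    vocab.fit(np.vstack(descriptor_sets))
    hists = []
    for d in descriptor_sets:
        words = vocab.predict(d)  # nearest visual word for each descriptor
        h = np.bincount(words, minlength=n_words).astype(np.float64)
        hists.append(h / h.sum())
    return np.array(hists)
```

As the background section notes, this representation discards where and when the descriptors occurred, which is the weakness the cited spatio-temporal coding methods try to address.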

Claims


Application Information

IPC(8): G06K9/66; G06K9/00
Inventor: 王春恒, 张重, 肖柏华, 刘爽, 周文
Owner INST OF AUTOMATION CHINESE ACAD OF SCI