Infrared behavior recognition method based on adaptive fusion of artificially designed features and deep learning features

A deep-learning-based recognition technology in the field of image processing and computer vision. It addresses the loss of effectiveness of visible-light video surveillance and improves the reliability of behavior recognition in infrared video.

Active Publication Date: 2016-07-20
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0005] However, there is currently relatively little research on behavior recognition based on infrared video. In video surveillance work, if ...



Examples


Embodiment Construction

[0039] The preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.

[0040] In the present invention, the artificially designed feature module extracts improved dense trajectory features from the original video and encodes the extracted features. The improved dense trajectory feature adds a gray-value weight to the descriptors of the original dense trajectories; it mainly captures the spatiotemporal information of the video image sequence and highlights the foreground motion information of the image sequence. The CNN feature module uses a variational optical flow algorithm to extract optical flow information from the original infrared image sequence, forming an optical flow image sequence; the extracted optical flow images are used as the input of a convolutional neural network, and the features of the fully connected layer of the convolutional neural network ...
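The excerpt above describes the optical-flow branch only in outline, so the following is a minimal, hypothetical sketch of that branch. It assumes OpenCV's Farneback flow as a stand-in for the variational optical flow algorithm named in the patent, and a torchvision ResNet-18 with its classification layer removed as a placeholder for the unspecified CNN whose fully connected layer supplies the feature; none of these choices are taken from the source.

```python
# Hypothetical sketch: optical-flow images from an infrared frame sequence,
# then a CNN descriptor taken before the final classification layer.
import cv2
import numpy as np
import torch
import torchvision.models as models

def flow_images(gray_frames):
    """Turn consecutive grayscale infrared frames into 3-channel flow images."""
    flows = []
    for prev, curr in zip(gray_frames[:-1], gray_frames[1:]):
        # Farneback flow is used here as a stand-in for the variational
        # optical flow algorithm described in the patent.
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        hsv = np.zeros((*prev.shape, 3), dtype=np.uint8)
        hsv[..., 0] = ang * 180 / np.pi / 2                       # direction -> hue
        hsv[..., 1] = 255
        hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)
        flows.append(cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR))
    return flows

# Placeholder CNN: ResNet-18 with the classifier replaced by Identity,
# so the forward pass returns the 512-d penultimate feature.
cnn = models.resnet18(weights=None)
cnn.fc = torch.nn.Identity()
cnn.eval()

def cnn_features(flow_img):
    """512-d CNN descriptor for one optical-flow image."""
    x = cv2.resize(flow_img, (224, 224)).astype(np.float32) / 255.0
    x = torch.from_numpy(x).permute(2, 0, 1).unsqueeze(0)        # NCHW
    with torch.no_grad():
        return cnn(x).squeeze(0).numpy()
```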



Abstract

The invention relates to an infrared behavior recognition method based on adaptive fusion of artificially designed features and deep learning features. The method comprises:
  • S1: extracting improved dense trajectory features from the original video with the artificially designed feature module;
  • S2: feature-encoding the extracted artificially designed features;
  • S3: in the CNN feature module, extracting optical flow information from the original video image sequence with a variational optical flow algorithm to obtain the corresponding optical flow image sequence;
  • S4: extracting CNN features from the optical flow sequence obtained in S3 with a convolutional neural network;
  • S5: dividing the data set into a training set and a testing set; learning weights on the training data with a weight-optimization network; fusing the probability outputs of the CNN feature classification network and the artificially designed feature classification network with the learned weights; selecting the optimal weight by comparing recognition results; and applying the optimal weight to classify the testing data.
The method provides a novel way of fusing features and improves the reliability of behavior recognition in infrared video, which is of great significance for subsequent video analysis.
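A minimal sketch of the fusion step in S5 follows, assuming the probability outputs of the two classifiers are already available as NumPy arrays. The patent's weight-optimization network is not detailed in this excerpt, so a grid search over a single scalar fusion weight is used here purely as a simplified stand-in; all names and shapes are made up.

```python
# Hypothetical sketch of S5: late fusion of the CNN-feature classifier and the
# hand-crafted-feature classifier with a weight learned on the training split.
import numpy as np

def fuse(p_cnn, p_hand, w):
    """Weighted fusion of two class-probability matrices of shape (N, C)."""
    return w * p_cnn + (1.0 - w) * p_hand

def learn_weight(p_cnn_train, p_hand_train, y_train, grid=np.linspace(0, 1, 101)):
    """Pick the fusion weight that maximizes accuracy on the training split.
    A scalar grid search stands in for the patent's weight-optimization network."""
    best_w, best_acc = 0.5, -1.0
    for w in grid:
        pred = fuse(p_cnn_train, p_hand_train, w).argmax(axis=1)
        acc = (pred == y_train).mean()
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w

# Usage with made-up shapes: 200 training clips, 12 behavior classes.
rng = np.random.default_rng(0)
p_cnn_tr = rng.dirichlet(np.ones(12), 200)
p_hand_tr = rng.dirichlet(np.ones(12), 200)
y_tr = rng.integers(0, 12, 200)
w_star = learn_weight(p_cnn_tr, p_hand_tr, y_tr)
# At test time the same learned weight fuses the two classifiers' outputs:
# y_pred = fuse(p_cnn_test, p_hand_test, w_star).argmax(axis=1)
```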

Description

Technical field
[0001] The invention belongs to the technical field of image processing and computer vision, and relates to an infrared behavior recognition method based on adaptive fusion of artificially designed features and deep learning features.
Background art
[0002] In recent years, behavior recognition in images and videos has become an important task in computer vision. Behavior recognition in videos is of great significance for video surveillance, video information retrieval, and human-computer interaction. As behavior recognition algorithms continue to raise the recognition accuracy on public data sets, the behavior recognition task in videos has made great progress. However, most current data sets are based on visible-light video, and there is relatively little behavior recognition work based on infrared video.
[0003] The current mainstream behavior recognition algorithms mainly involve two types of descriptors: artificially ...


Application Information

IPC (8): G06K9/00; G06K9/62; G06N3/08
CPC: G06N3/084; G06V40/20; G06V20/40; G06V20/46; G06F18/254; G06F18/24
Inventors: 高陈强, 吕静, 杜银和, 汪澜, 刘江
Owner: CHONGQING UNIV OF POSTS & TELECOMM