Behavior recognition method, device, equipment and medium based on LRCN network

A behavior recognition method and related technology, applied to character and pattern recognition, biological neural network models, instruments, and the like; it addresses the problem of high computational overhead and achieves the effect of reducing the amount of calculation.

Active Publication Date: 2021-07-23
上海清微智能科技有限公司

AI Technical Summary

Problems solved by technology

[0007] The embodiment of the present invention provides a behavior recognition method based on the LRCN (Long-term Recurrent Convolutional Network) to solve the technical problem of large computational overhead in the prior art when performing behavior recognition with an LRCN network.
[0009] The embodiment of the present invention also provides a behavior recognition device based on the LRCN network to solve the same technical problem of large computational overhead.




Detailed Description of Embodiments

[0023] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in connection with the embodiments and the accompanying drawings. The schematic embodiments described herein serve to explain the invention and are not intended to limit it.

[0024] From the LRCN network structure shown in Figure 1, the inventors observed that convolution accounts for most of the computation in the entire process. When performing behavior recognition on a video sequence, each picture in the input portion of the LRCN network must enter a separate convolutional neural network for calculation; for a 20-frame sequence, 20 separate convolutional passes are performed, and the weights of the convolutional neural network at each time step are different. However, the image information between adjacent frames is in fact highly redundant, and calculating directly on the original images...
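The cost the paragraph above describes can be illustrated with a back-of-the-envelope count of multiply-accumulate operations: the baseline LRCN runs one full convolutional pass per frame, so the front-end cost grows linearly with the sequence length. The layer shape below (a single 3x3 convolution on a 224x224 RGB input with 64 output channels) is illustrative and not taken from the patent.

```python
# Back-of-the-envelope MAC count for the convolutional front end of a
# baseline LRCN. All layer shapes are illustrative, not from the patent.

def conv2d_macs(h, w, c_in, c_out, k=3):
    """Multiply-accumulate count for a stride-1 'same' 2D convolution."""
    return h * w * c_in * c_out * k * k

T = 20                                         # frames in the sequence
per_frame = conv2d_macs(224, 224, c_in=3, c_out=64)

# Baseline LRCN: every frame runs through its own convolutional pass,
# so the total cost scales linearly with the number of frames T.
baseline = T * per_frame

print(f"per-frame conv MACs: {per_frame:,}")
print(f"baseline, {T} passes: {baseline:,}")
```

Even this single toy layer already costs roughly 87 million MACs per frame, which is why reducing the number of convolutional passes over redundant adjacent frames matters.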



Abstract

Embodiments of the present invention provide a behavior recognition method, device, equipment, and readable storage medium based on an LRCN network. The method includes: acquiring a video frame sequence to be recognized and its corresponding optical flow maps, and inputting the sequence and the corresponding optical flow maps into a long-term recurrent convolutional network model to obtain the behavior category label of the video frame sequence. Each group of a preset number of adjacent frames in the video frame sequence is input into the first convolutional neural network in the model, and the optical flow maps corresponding to that preset number of frames are input into the second convolutional neural network in the model. The convolutional neural networks share convolutional layers by performing data fusion on the preset number of frames and on the corresponding optical flow maps, respectively. By introducing sharing between convolutional layers, this scheme removes much of the redundancy in the image information between adjacent frames before behavior recognition is performed, thereby helping to reduce the overall computational load of the network.
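The fusion-and-sharing idea in the abstract can be sketched as follows. This is an illustrative reading, not the patent's exact architecture: a window of adjacent frames is fused by channel concatenation so that one shared convolutional pass handles the whole window, instead of one pass per frame; the same shared weights would likewise be reused for the optical-flow stream. A 1x1 convolution (a per-pixel linear map over channels) stands in for the shared convolutional layers to keep the example short.

```python
import numpy as np

rng = np.random.default_rng(0)

T, H, W, C = 20, 8, 8, 3       # 20 RGB frames (toy spatial size)
k = 4                          # preset number of adjacent frames per group
frames = rng.standard_normal((T, H, W, C))

# Shared 1x1-conv weights, (in_channels = k*C) -> 16 feature channels.
# The same weights are reused for every window (convolutional sharing).
W_shared = rng.standard_normal((k * C, 16))

def fused_conv(window):
    """Fuse k adjacent frames along channels, then apply the shared conv."""
    fused = window.transpose(1, 2, 0, 3).reshape(H, W, k * C)  # (H, W, k*C)
    return fused @ W_shared                                    # (H, W, 16)

# One convolutional pass per k-frame window instead of one per frame.
features = [fused_conv(frames[t:t + k]) for t in range(0, T, k)]

print(f"conv passes: {len(features)} instead of {T}")  # 5 instead of 20
print(f"feature map shape: {features[0].shape}")       # (8, 8, 16)
```

With a window of 4 frames, the convolutional front end runs 5 passes rather than 20, which is the kind of reduction in redundant computation the abstract describes.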

Description

Technical field
[0001] The present invention relates to the field of behavior recognition, and in particular to a behavior recognition method, apparatus, equipment, and readable storage medium based on an LRCN (Long-term Recurrent Convolutional Network) network.
Background technique
[0002] Behavior recognition is a specific example of a sequence learning task, taking a time-ordered image sequence as input. The purpose of behavior recognition is to identify the behavior of one or more agents from a series of observations of the agents and the environmental conditions. Since the 1980s, this research area has attracted the attention of many computer science communities and has connections with many different research fields, such as medicine, human-computer interaction, and sociology.
[0003] Currently, the LRCN network, which combines a convolutional neural network (CNN) with a recurrent neural network (LSTM), is applied to video sequences for behavior recognition. The identification metho...
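The CNN-plus-LSTM combination described in [0003] can be sketched minimally: a per-frame feature extractor (standing in for the CNN) feeds an LSTM that aggregates the sequence into one behavior label. All sizes, the random weights, and the stand-in feature extractor are illustrative assumptions, not the patent's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

T, D, Hdim, n_classes = 20, 32, 16, 5
# Stand-in for per-frame CNN outputs: one D-dim feature vector per frame.
frame_features = rng.standard_normal((T, D))

# One set of LSTM weights, shared across all time steps.
Wx = rng.standard_normal((D, 4 * Hdim)) * 0.1
Wh = rng.standard_normal((Hdim, 4 * Hdim)) * 0.1
b = np.zeros(4 * Hdim)
W_out = rng.standard_normal((Hdim, n_classes)) * 0.1

h = np.zeros(Hdim)
c = np.zeros(Hdim)
for x in frame_features:              # unroll the LSTM over the sequence
    gates = x @ Wx + h @ Wh + b
    i, f, g, o = np.split(gates, 4)   # input, forget, candidate, output
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)        # cell state update
    h = o * np.tanh(c)                # hidden state

logits = h @ W_out                    # classify from the final hidden state
label = int(np.argmax(logits))        # behavior category index
print(f"predicted behavior class index: {label}")
```

In a trained LRCN the feature extractor is a real CNN and the weights are learned; the point here is only the data flow: per-frame features in, one sequence-level behavior label out.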

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/46; G06N3/04
CPC: G06V20/40; G06V10/44; G06N3/045
Inventors: 欧阳鹏, 尹首一, 李秀东, 王博
Owner: 上海清微智能科技有限公司